Automate Your Daily Tasks with Python: 5 Practical Scripts
rajneesh
- python
Python is an easy-to-read, beginner-friendly language, which makes it simple to write scripts that take care of everyday tasks for us.
In this post, I will share five scripts you can use to automate your daily tasks with one click. Python has a huge collection of libraries and modules, so most of these scripts need only a few lines of code. Without wasting any time, let's start the blog post.
1. Automate Email Sending
Writing routine emails takes a lot of time. We can fix this with Python's smtplib and email libraries, which let you send emails programmatically so you can easily automate daily reports, reminders, or notifications.
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

def send_email(subject, body, to_email, from_email="youremail@example.com", password="yourpassword"):
    try:
        # Create the email
        msg = MIMEMultipart()
        msg['From'] = from_email
        msg['To'] = to_email
        msg['Subject'] = subject
        msg.attach(MIMEText(body, 'plain'))

        # Connect to the server and send the email
        with smtplib.SMTP('smtp.example.com', 587) as server:
            server.starttls()
            server.login(from_email, password)
            server.send_message(msg)

        print(f"Email sent successfully to {to_email}")
    except smtplib.SMTPException as e:
        print(f"Failed to send email: {e}")

# Example Usage
send_email(
    subject="Daily Update",
    body="Here's your daily update!",
    to_email="recipient@example.com"
)
Key Features:
Automates sending emails with a single function call.
Customizable subject and body.
Integrates with various email services (just change the SMTP host and port).
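A practical tweak worth mentioning: rather than hardcoding your password in the script, you can read the credentials from environment variables. A minimal sketch, assuming you have set EMAIL_ADDRESS and EMAIL_PASSWORD in your environment (these variable names are just placeholders):
import os

# Read credentials from environment variables instead of hardcoding them.
# EMAIL_ADDRESS and EMAIL_PASSWORD are placeholder names for this example.
from_email = os.environ.get("EMAIL_ADDRESS")
password = os.environ.get("EMAIL_PASSWORD")

send_email(
    subject="Daily Update",
    body="Here's your daily update!",
    to_email="recipient@example.com",
    from_email=from_email,
    password=password
)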
2. Web Scraping for Data Collection
Collecting data from websites by hand is a tedious process. We can automate it with two simple Python libraries, requests and BeautifulSoup, so that Python gathers the information for us in one click.
import requests
from bs4 import BeautifulSoup

def scrape_website(url):
    try:
        response = requests.get(url)
        response.raise_for_status()
        soup = BeautifulSoup(response.content, 'html.parser')

        # Example: Extract all the headlines
        headlines = soup.find_all('h2')
        if headlines:
            for headline in headlines:
                print(headline.get_text())
        else:
            print("No headlines found on the page.")
    except requests.exceptions.RequestException as e:
        print(f"Error fetching data from {url}: {e}")

# Example Usage
scrape_website("https://example.com/news")
Key Features:
Automatically extracts data from web pages.
Parses HTML content in a few lines.
Can be extended to gather data from multiple pages (see the sketch below).
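To gather data from several pages, one rough approach is to simply reuse scrape_website over a list of URLs. The page addresses below are hypothetical placeholders, and the short pause is just to avoid hammering the server:
import time

# Hypothetical list of pages to scrape
pages = [
    "https://example.com/news?page=1",
    "https://example.com/news?page=2",
    "https://example.com/news?page=3",
]

for page_url in pages:
    scrape_website(page_url)
    time.sleep(1)  # small delay between requests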
3. Automate File Organization
Organizing files by hand is another tedious process. Python can move files into folders automatically based on their extensions (file types).
import os
import shutil

def organize_files(directory):
    try:
        if not os.path.exists(directory):
            print(f"Directory {directory} does not exist.")
            return

        extensions = {
            'TextFiles': ['.txt', '.docx'],
            'Images': ['.jpg', '.png', '.gif'],
            'PDFs': ['.pdf']
        }

        for folder, ext_list in extensions.items():
            folder_path = os.path.join(directory, folder)
            if not os.path.exists(folder_path):
                os.makedirs(folder_path)

            for filename in os.listdir(directory):
                if os.path.isfile(os.path.join(directory, filename)):
                    file_ext = os.path.splitext(filename)[1]
                    if file_ext in ext_list:
                        shutil.move(os.path.join(directory, filename), folder_path)
                        print(f"Moved {filename} to {folder}")
    except Exception as e:
        print(f"Error organizing files: {e}")

# Example Usage
organize_files('/path/to/directory')
Key Features:
Automatically moves files into folders according to their type.
Customizable for different file types (see the example below).
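For example, if you also wanted to sort videos and archives, you could extend the extensions dictionary like this (the folder names and extension lists are just an illustration):
extensions = {
    'TextFiles': ['.txt', '.docx'],
    'Images': ['.jpg', '.png', '.gif'],
    'PDFs': ['.pdf'],
    'Videos': ['.mp4', '.mkv'],    # added category (example)
    'Archives': ['.zip', '.tar']   # added category (example)
}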
4. Schedule Tasks with a Task Scheduler
By using the schedule library (a third-party package, installed with pip install schedule), you can run any task at whatever time you need:
import schedule
import time

def job():
    print("Task running...")

def run_scheduler():
    try:
        schedule.every().day.at("10:00").do(job)
        print("Task scheduled to run every day at 10:00.")

        while True:
            schedule.run_pending()
            time.sleep(1)
    except Exception as e:
        print(f"Error in scheduling task: {e}")

# Example Usage
run_scheduler()
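The schedule library supports other intervals too; a few common patterns from its documentation look like this:
# Run every 10 minutes
schedule.every(10).minutes.do(job)

# Run every hour
schedule.every().hour.do(job)

# Run every Monday at 09:00
schedule.every().monday.at("09:00").do(job)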
5. Automate Data Backup
Sometimes we need to take regular backups of specific files. We can automate this with a Python script that copies the files from the directory you point it at into your chosen backup location, inside a timestamped folder:
import shutil
import os
from datetime import datetime

def backup_files(src_directory, backup_directory):
    try:
        if not os.path.exists(src_directory):
            print(f"Source directory {src_directory} does not exist.")
            return

        if not os.path.exists(backup_directory):
            os.makedirs(backup_directory)

        backup_folder = os.path.join(backup_directory, datetime.now().strftime("%Y%m%d_%H%M%S"))
        os.makedirs(backup_folder)

        for filename in os.listdir(src_directory):
            file_path = os.path.join(src_directory, filename)
            if os.path.isfile(file_path):
                shutil.copy2(file_path, backup_folder)
                print(f"Backed up {filename} to {backup_folder}")
    except Exception as e:
        print(f"Error backing up files: {e}")

# Example Usage
backup_files('/path/to/data', '/path/to/backup')
Key Features:
Automatically copies files to a backup location.
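If you want the backup to run on its own, one option is to combine this script with the schedule library from section 4. A minimal sketch, assuming the backup_files function and the example paths shown above:
import schedule
import time

# Run the backup every day at 22:00 (the time and paths are just examples)
schedule.every().day.at("22:00").do(backup_files, '/path/to/data', '/path/to/backup')

while True:
    schedule.run_pending()
    time.sleep(60)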
Conclusion
I have covered some of the most common problems that come up in our daily lives and tried to solve them with Python scripts. In this post I walked through five basic solutions using Python modules: how to automate email sending, how to automate data gathering from any site, how to organize files according to their types, how to schedule any task on your own timetable, and how to automate backups of your files. If you think I missed a daily-life problem that can be automated with Python, leave a comment and I will add that script as well 😎.