A modular template for scraping data from the web to send yourself scheduled email reports
The repository is set up in a modular fashion so you can easily substitute in your own email address and report contents. I have personally found this to be a fun use for custom web scrapers: they grab information off the internet and add it to an email report that is sent to me every morning.
`email_me.py`
- This is the executable file that initiates message composition and sends the email.

`message` folder:
- `compose_message.py` - calls the message modules and composes the text of the message.
- `game_scores.py` - example module: a web scraper that acquires baseball scores from games played last night. Look here for more info on the contents.
- `countdowns.py` - example module: a series of simple countdowns. Look here for more info on the contents.
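A message module only needs to hand back a block of text for the report. A minimal sketch of what a countdown-style module might look like (the function name, label, and date handling are illustrative assumptions, not the actual contents of `countdowns.py`):

```python
from datetime import date

def countdown_message(label, target):
    """Return a one-line countdown to a target date.

    `label` and `target` are hypothetical parameters; the real
    countdowns.py in the repository may be structured differently.
    """
    days_left = (target - date.today()).days
    return f"{days_left} days until {label}"

# A module's output is just a block of text for the report.
print(countdown_message("the weekend", date.today()))
```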
To get the report up and running there are a few steps you need to follow:

1. Add your email details to `email_me.py`.
2. Write your own modules and add them to the `message` folder.
3. Call your modules from the `compose_message.py` file; anything called there will be composed into the message and run automatically.
4. Running `email_me.py` will show you the output and help with spacing, newlines, etc.
5. Edit `compose_message.py` whenever you like, so you can easily change the report over time.
6. Schedule `python email_me.py` to run on a timer with cron.
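The flow described above can be sketched as follows. The SMTP server, subject line, and function signatures here are assumptions for illustration, not the repository's actual code (see `email_me.py` for the real sending logic):

```python
import smtplib
from email.message import EmailMessage

def compose_message(sections):
    """Join each module's text into one report body, separated by blank lines."""
    return "\n\n".join(sections)

def send_report(body, sender, recipient, password):
    """Send the composed report; assumes an SMTP-over-SSL server like Gmail's."""
    msg = EmailMessage()
    msg["Subject"] = "Morning report"  # hypothetical subject line
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(body)
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
        server.login(sender, password)
        server.send_message(msg)
```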
To schedule the job, edit your crontab:

```
crontab -e
```

For example, this entry sends the report at 6:00 every weekday:

```
0 6 * * 1-5 python /Users/Cam/bin/email_me.py
```

The five timing fields work like this:

```
* * * * * python ./email_me.py
- - - - -
| | | | |
| | | | +----- day of week (0 - 6) (Sunday=0)
| | | +------- month (1 - 12)
| | +--------- day of month (1 - 31)
| +----------- hour (0 - 23)
+------------- min (0 - 59)
```
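One cron pitfall worth noting: cron jobs run with a minimal environment, so absolute paths for both the interpreter and the script are safest, and redirecting output to a log makes failed runs easier to debug. An illustrative entry (the paths are examples only; adjust to your machine):

```
# Absolute paths avoid cron's minimal PATH; the log captures errors.
0 6 * * 1-5 /usr/bin/python3 /Users/Cam/bin/email_me.py >> /tmp/email_report.log 2>&1
```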
That's it! A simple template for a custom email report. Feel free to take this and use it yourself! I would love to hear about any interesting scrapers or data aggregation modules that you construct and run in this template.
The example web scraper `game_scores.py` uses the Beautiful Soup package; apart from that, the repository uses only the Python 3 standard library. You can install Beautiful Soup with:

```
pip install bs4
```
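For reference, here is a tiny Beautiful Soup sketch. The HTML, tag name, and class are made up for illustration and are not what `game_scores.py` actually parses:

```python
from bs4 import BeautifulSoup

# Stand-in HTML; a real scraper would fetch a live page first.
html = "<html><body><h2 class='score'>Cubs 5, Mets 3</h2></body></html>"
soup = BeautifulSoup(html, "html.parser")

# find_all returns every matching tag; get_text() strips the markup.
scores = [tag.get_text() for tag in soup.find_all("h2", class_="score")]
print(scores[0])
```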