An end-to-end ML application using PyTorch, W&B, FastAPI, Docker, Streamlit and Heroku → https://e2e-ml-app-pytorch.herokuapp.com/ (may take a few minutes to spin up occasionally).
This project was created using the Made With ML boilerplate template. Check it out to start creating your own ML applications.
virtualenv -p python3.6 venv
source venv/bin/activate
pip install -r requirements.txt
pip install torch==1.4.0
python text_classification/utils.py
python text_classification/train.py \
--data-url https://raw.githubusercontent.com/madewithml/lessons/master/data/news.csv --lower --shuffle --use-glove
uvicorn text_classification.app:app --host 0.0.0.0 --port 5000 --reload
GOTO: http://localhost:5000/docs
python text_classification/predict.py --text 'The Canadian government officials proposed the new federal law.'
curl "http://localhost:5000/predict" \
    -X POST -H "Content-Type: application/json" \
    -d '{
        "inputs": [
            {
                "text": "The Wimbledon tennis tournament starts next week!"
            },
            {
                "text": "The Canadian government officials proposed the new federal law."
            }
        ]
    }' | json_pp
import json

import requests

headers = {
    'Content-Type': 'application/json',
}
data = {
    "experiment_id": "latest",
    "inputs": [
        {
            "text": "The Wimbledon tennis tournament starts next week!"
        },
        {
            "text": "The Canadian minister signed in the new federal law."
        }
    ]
}
response = requests.post('http://localhost:5000/predict',
                         headers=headers, data=json.dumps(data))
results = json.loads(response.text)
print(json.dumps(results, indent=2, sort_keys=False))
streamlit run text_classification/streamlit.py
GOTO: http://localhost:8501
pytest
docker build -t text-classification:latest -f Dockerfile .
docker run -d -p 5000:5000 -p 6006:6006 --name text-classification text-classification:latest
Set `WANDB_API_KEY` as an environment variable.
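For example (the key value below is a placeholder; substitute your own W&B API key), you can export the variable in your shell before training, or pass it into the container with Docker's `-e` flag:

```shell
# Make the W&B API key available to scripts in this shell session
# (replace the placeholder with your actual key):
export WANDB_API_KEY="your-api-key-here"

# Or pass it into the container at run time:
# docker run -d -p 5000:5000 -e WANDB_API_KEY=$WANDB_API_KEY \
#     --name text-classification text-classification:latest
echo "$WANDB_API_KEY"
```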
text-classification/
├── datasets/                - datasets
├── logs/                    - directory of log files
│   ├── errors/              - error log
│   └── info/                - info log
├── tests/                   - unit tests
├── text_classification/     - ml scripts
│   ├── app.py               - app endpoints
│   ├── config.py            - configuration
│   ├── data.py              - data processing
│   ├── models.py            - model architectures
│   ├── predict.py           - prediction script
│   ├── streamlit.py         - streamlit app
│   ├── train.py             - training script
│   └── utils.py             - load embeddings and utilities
├── wandb/                   - wandb experiment runs
├── .dockerignore            - files to ignore on docker
├── .gitignore               - files to ignore on git
├── CODE_OF_CONDUCT.md       - code of conduct
├── CODEOWNERS               - code owner assignments
├── CONTRIBUTING.md          - contributing guidelines
├── Dockerfile               - dockerfile to containerize app
├── LICENSE                  - license description
├── logging.json             - logger configuration
├── Procfile                 - process script for Heroku
├── README.md                - this README
├── requirements.txt         - pip requirements
├── setup.sh                 - streamlit setup for Heroku
└── sweeps.yaml              - hyperparameter wandb sweeps config
python text_classification/train.py \
--data-url https://raw.githubusercontent.com/madewithml/lessons/master/data/news.csv --lower --shuffle --data-size 0.1 --num-epochs 3
python text_classification/train.py \
--data-url https://raw.githubusercontent.com/madewithml/lessons/master/data/news.csv --lower --shuffle
python text_classification/train.py \
--data-url https://raw.githubusercontent.com/madewithml/lessons/master/data/news.csv --lower --shuffle --use-glove --freeze-embeddings
python text_classification/train.py \
--data-url https://raw.githubusercontent.com/madewithml/lessons/master/data/news.csv --lower --shuffle --use-glove
End-to-end topics that will be covered in subsequent lessons.
• Build image
docker build -t madewithml:latest -f Dockerfile .
• Run container if using CMD ["python", "app.py"]
or ENTRYPOINT ["/bin/sh", "entrypoint.sh"]
docker run -p 5000:5000 --name madewithml madewithml:latest
• Get inside container if using CMD ["/bin/bash"]
docker run -p 5000:5000 -it madewithml /bin/bash
• Run container with mounted volume
docker run -p 5000:5000 -v $PWD:/root/madewithml/ --name madewithml madewithml:latest
• Other flags
-d: detached
-ti: interactive terminal
• Clean up
docker stop $(docker ps -a -q) # stop all containers
docker rm $(docker ps -a -q) # remove all containers
docker rmi $(docker images -a -q) # remove all images