Automated testing and deployment of a simple Flask-based (RESTful) micro-service to a production-like environment on AWS, using Docker containers and Travis-CI.
The purpose of this project is to demonstrate how to automate the testing and deployment of a simple Flask-based (RESTful) micro-service to a production-like environment on AWS. The deployment pipeline is handled by Travis-CI, which has been granted access to this GitHub repository and configured to run upon a pull request or a merge to the `master` branch. The pipeline is defined in the `.travis.yml` file and consists of the following steps:
- install the `pipenv` package using `pip`;
- use Pipenv to install the project dependencies defined in `Pipfile.lock`;
- run the test suite with `pipenv run python -m unittest tests/*.py`; and,
- if the tests pass on the `master` branch - e.g. if a pull request has been merged - then start Docker and run the `deploy_to_aws.py` script.

The `deploy_to_aws.py` script defines the deployment process, which runs from start to finish without any manual intervention.
It is reliant on the definition of three environment variables: `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_REGION`. For security reasons, these are kept out of the `.travis.yml` file and are instead defined using the Travis-CI UI.
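As a minimal sketch of how a deployment script might fail fast when these variables are absent - note that `load_aws_config` is a hypothetical helper for illustration, not a function from `deploy_to_aws.py`:

```python
import os

# The three variables the deployment script relies on.
REQUIRED_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION")


def load_aws_config():
    """Return the AWS credentials from the environment, or fail with a clear error."""
    missing = [var for var in REQUIRED_VARS if var not in os.environ]
    if missing:
        raise RuntimeError(f"missing environment variables: {', '.join(missing)}")
    return {var: os.environ[var] for var in REQUIRED_VARS}
```

Failing early with an explicit message makes a broken Travis-CI configuration obvious in the build log, rather than surfacing as an opaque AWS authentication error mid-deployment.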
Although the micro-service used in this example - as defined in the `microservice/api.py` module - only returns a simple message upon a `GET` request, it could just as easily be a Machine Learning (ML) model-scoring service that receives the values of feature variables and returns a prediction - the overall pattern is the same.
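For illustration, a minimal sketch of such a scoring service - assuming a hypothetical 'model' that computes `2 * x + 1` from a single feature `x` passed as a query parameter (the real endpoint lives in `microservice/api.py`):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/microservice", methods=["GET"])
def score():
    # Hypothetical stand-in for a model: in practice this would call
    # something like model.predict(features).
    x = float(request.args.get("x", "0"))
    return jsonify({"prediction": 2 * x + 1})
```

Swapping the message-returning handler for a scoring handler changes nothing about the testing and deployment pipeline described above.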
Currently, the initial setup of the required AWS infrastructure is entirely manual (although this could also be scripted in the future). What's required is an ECS cluster capable of hosting multiple groups of Docker containers (or 'tasks' - i.e. web applications, or in our case just a single micro-service), that sit behind a load balancer that accepts incoming traffic and routes it to different containers in the cluster. Collectively, this constitutes a 'service' that is highly available. At a high-level, the steps required to set up this infrastructure using the AWS management console are as follows (assuming the existence of a repository in ECR, containing our Docker image):

- create an ECS cluster backed by EC2 instances (e.g. `t2.medium`);
- point the load balancer's health check at `/microservice`, otherwise it won't get 200s and will repeatedly try to re-register hosts;
- run the service in `daemon` mode - i.e. assume there is only one container per-task.

We use pipenv for managing project dependencies and Python environments (i.e. virtual environments). All of the direct package dependencies required to run the code (e.g. `docker` and `boto3`), as well as all the packages used during development (e.g. IPython for interactive console sessions), are described in the `Pipfile`. Their precise downstream dependencies are described in `Pipfile.lock`.
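A `Pipfile` for a project like this might look roughly as follows - the version pins and Python version are illustrative assumptions, not taken from the project's actual `Pipfile`:

```toml
[packages]
flask = "*"
boto3 = "*"
docker = "*"

[dev-packages]
ipython = "*"

[requires]
python_version = "3.7"
```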
To get started with Pipenv, first of all install it - assuming that there is a global version of Python available on your system and on the PATH, this can be achieved by running the following command,

```bash
pip3 install pipenv
```
Pipenv is also available to install from many non-Python package managers. For example, on OS X it can be installed using the Homebrew package manager, with the following terminal command,

```bash
brew install pipenv
```
For more information, including advanced configuration options, see the official pipenv documentation.
Make sure that you're in the project's root directory (the same one in which `Pipfile` resides), and then run,

```bash
pipenv install --dev
```

This will install all of the direct project dependencies as well as the development dependencies (the latter are a consequence of the `--dev` flag).
In order to continue development in a Python environment that precisely mimics the one the project was initially developed with, use Pipenv from the command line as follows,

```bash
pipenv run python3
```

The `python3` command could just as well be `ipython3` or the Jupyter notebook server, for example,

```bash
pipenv run jupyter notebook
```
This will fire-up a Jupyter notebook server where the default Python 3 kernel includes all of the direct and development project dependencies. This is how we advise that the notebooks within this project are used.
All tests have been written using the unittest package from the Python standard library. Tests are kept in the `tests` folder and can be run from the command line - e.g. by invoking,

```bash
pipenv run python -m unittest tests/test_*.py
```
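As a sketch of what one of these tests might look like - using a self-contained stand-in app here, since the real routes live in `microservice/api.py` - Flask's built-in test client pairs naturally with `unittest`:

```python
import unittest

from flask import Flask, jsonify

# Stand-in for the real app defined in microservice/api.py; the route and
# message are hypothetical, chosen just to make the example self-contained.
app = Flask(__name__)


@app.route("/microservice", methods=["GET"])
def handler():
    return jsonify({"message": "hello"})


class TestMicroservice(unittest.TestCase):
    def test_get_microservice_returns_200(self):
        # Flask's test client issues requests without starting a server.
        response = app.test_client().get("/microservice")
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.get_json(), {"message": "hello"})
```

Because the test client never binds a port, tests like this run quickly inside the Travis-CI build without any extra setup.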
The micro-service can be started via the command line, from the root directory, using,

```bash
pipenv run python -m microservice.api
```

which will start the server at `http://localhost:5000/microservice`.