Fast model deployment on any cloud 🚀
Please see the latest BentoML documentation on the OCI container-based deployment workflow: https://docs.bentoml.com/
bentoctl helps deploy any machine learning model as a production-ready API endpoint on the cloud, supporting AWS SageMaker, AWS Lambda, EC2, Google Compute Engine, Azure, Heroku, and more.
👉 Join our Slack community today!
✨ Looking to deploy your ML service quickly? Check out BentoML Cloud for the easiest and fastest way to deploy your bento. It's a full-featured, serverless environment with a model repository and built-in monitoring and logging.
There are many ways to contribute to the project: