bentoctl is a CLI tool for deploying your BentoML-packaged ML models as API endpoints on popular cloud platforms. It automates the Bento docker image build, interacts with cloud platform APIs, and lets you easily manage your deployments.
- Supports major cloud providers: AWS, Azure, Google Cloud, and more.
- Deploy, update, and operate cloud deployments with ease.
- Optimized for CI/CD workflows.
- Extensible with custom operators.
- AWS EC2
- AWS Lambda
- AWS SageMaker
- Azure Functions
- Azure Container Instances
- Google Cloud Run
- Google Compute Engine
- Heroku
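Each platform in this list is supported through an operator, a plugin that tells bentoctl how to build and deploy for that target. Once bentoctl is installed (see below), operators are managed from the CLI. A minimal sketch, assuming the operator subcommands of recent bentoctl releases (names may differ in the pre-release version; check `bentoctl operator --help`):

```bash
# Sketch only: subcommand names are assumptions based on recent bentoctl
# releases; confirm with `bentoctl operator --help` for your version.

# See which operators are available / installed.
bentoctl operator list

# Install the operator for the target platform (AWS Lambda here).
bentoctl operator install aws-lambda
```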
pip install --pre bentoctl
| 💡 bentoctl is in the pre-release stage; use the --pre flag to install the pre-release version.
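A typical deployment then runs through generating a deployment config, building the deployable image, and provisioning the cloud resources. The sketch below assumes the aws-lambda operator from the earlier example and the command/flag names of recent bentoctl releases; the Quickstart Guide documents the exact, version-specific steps.

```bash
# Sketch only: command names and flags are assumptions; follow the
# Quickstart Guide for the authoritative, version-specific workflow.

# Interactively generate deployment_config.yaml and the Terraform templates.
bentoctl init

# Build and push the docker image for your bento
# (iris_classifier:latest is a placeholder bento tag).
bentoctl build -b iris_classifier:latest -f deployment_config.yaml

# Provision the cloud resources from the generated Terraform files.
terraform init
terraform apply -var-file=bentoctl.tfvars -auto-approve

# Tear the deployment down when it is no longer needed.
bentoctl destroy -f deployment_config.yaml
```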
- Quickstart Guide walks through the steps to deploy a bento to AWS Lambda as an API server.
- Core Concepts explains the core concepts in bentoctl.
- Operator List lists official operators and their current status.
- To report a bug or suggest a feature request, use GitHub Issues.
- For other discussions, use GitHub Discussions under the BentoML repo.
- To receive release announcements and get support, join us on Slack.
There are many ways to contribute to the project:
- Create and share new operators. Use the deployment operator template to get started.
- If you have any feedback on the project, share it with the community in GitHub Discussions under the BentoML repo.
- Report issues you're facing and give a "thumbs up" to issues and feature requests that are relevant to you.
- Investigate bugs and review other developers' pull requests.