bentoctl
bentoctl is a CLI tool for deploying your machine learning models to any cloud platform. It is built on top of BentoML, the unified model serving framework, and makes it easy to bring any BentoML-packaged model to production.
👉 Pop into our Slack community! We're happy to help with any issue you face or even just to meet you and hear what you're working on :)
Features:

- Supports major cloud providers: AWS, Azure, Google Cloud, and more.
- Easy to deploy, update and reproduce model deployments.
- First-class integration with Terraform.
- Optimized for CI/CD workflow.
- Extensible with custom operators.
- High performance serving powered by BentoML.

Supported platforms:

- AWS Lambda
- AWS SageMaker
- AWS EC2
- Google Cloud Run
- Google Compute Engine
- Azure Container Instances
- Heroku
- Looking for Kubernetes? Try out Yatai: Model deployment at scale on Kubernetes.
- Customize your deploy target by creating a bentoctl plugin from the deployment operator template.
Install bentoctl via pip:

pip install bentoctl
💡 bentoctl is designed to work with BentoML version 1.0.0 and above. For BentoML 0.13 or below, you can use the pre-v1.0 branch in the operator repositories and follow the instructions in the README. You can also check out the quickstart guide for 0.13 here.
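Once installed, a typical deployment (following the Quickstart Guide) looks roughly like the sketch below; the operator name aws-lambda, the bento tag my_bento:latest, and the file names are illustrative and will differ for your project:

bentoctl operator install aws-lambda        # install the operator for the target platform
bentoctl init                               # interactively generate deployment_config.yaml
bentoctl build -b my_bento:latest -f deployment_config.yaml    # build and push the deployable image
terraform init                              # initialize the generated Terraform project
terraform apply -var-file=bentoctl.tfvars -auto-approve        # create the cloud resources
bentoctl destroy -f deployment_config.yaml  # tear everything down when done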
Documentation:

- Quickstart Guide walks through a series of steps to deploy a bento to AWS Lambda as an API server.
- Core Concepts explains the core concepts in bentoctl.
- Operator List lists official operators and their current status.
Community:

- To report a bug or suggest a feature request, use GitHub Issues.
- For other discussions, use GitHub Discussions under the BentoML repo.
- To receive release announcements and get support, join us on Slack.
There are many ways to contribute to the project:
- Create and share new operators. Use the deployment operator template to get started.
- If you have any feedback on the project, share it with the community in GitHub Discussions under the BentoML repo.
- Report issues you're facing and "Thumbs up" on issues and feature requests that are relevant to you.
- Investigate bugs and review other developers' pull requests.
BentoML and bentoctl collect usage data that helps our team improve the product. Only bentoctl's CLI command calls are reported. We strip out as much potentially sensitive information as possible, and we will never collect user code, model data, model names, or stack traces. Here's the code for usage tracking. You can opt out of usage tracking by setting the environment variable BENTOML_DO_NOT_TRACK=True:
export BENTOML_DO_NOT_TRACK=True
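To make the opt-out persistent across shell sessions, you can add that line to your shell profile, for example (assuming bash; adjust the file for your shell):

echo 'export BENTOML_DO_NOT_TRACK=True' >> ~/.bashrc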