CLI tool for deploying ML models in the cloud - built on top of BentoML 🚀

bentoctl

Fast model deployment with BentoML on cloud platforms

bentoctl is a CLI tool for deploying your BentoML-packaged ML models as API endpoints on popular cloud platforms. It automates the Bento docker image build, interacts with cloud platform APIs, and allows users to easily manage their deployments.

Features:

  • Supports major cloud providers: AWS, Azure, Google Cloud, and more.
  • Easily deploy, update, and operate cloud deployments.
  • Optimized for CI/CD workflows.
  • Extensible with custom operators.
*Demo: bentoctl deploying to AWS EC2*

Supported Platforms:

Install bentoctl

pip install --pre bentoctl

| 💡 bentoctl is in its pre-release stage; use the --pre flag to install the pre-release version.

Next steps

  • Quickstart Guide walks through a series of steps to deploy a bento to AWS Lambda as an API server.
  • Core Concepts explains the core concepts in bentoctl.
  • Operator List lists official operators and their current status.
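
Deployments are driven by a `deployment_config.yaml` file that names the operator and its settings. The sketch below is illustrative only — the exact fields and values depend on the operator and bentoctl version you use, so generate the real file interactively with `bentoctl generate` rather than copying this:

```yaml
# deployment_config.yaml — illustrative sketch, not generated output.
# Field values (name, region, sizes) are placeholder assumptions.
api_version: v1
name: quickstart
operator:
  name: aws-lambda
template: terraform
spec:
  region: us-west-1
  timeout: 10
  memory_size: 512
```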

Community

Contributing

There are many ways to contribute to the project:

  • Create and share new operators. Use deployment operator template to get started.
  • If you have any feedback on the project, share it with the community in GitHub Discussions under the BentoML repo.
  • Report issues you're facing and "Thumbs up" on issues and feature requests that are relevant to you.
  • Investigate bugs and review other developers' pull requests.