
🍌 Banana Serverless Whisper Template

This repo gives a basic framework for serving OpenAI's Whisper in production using simple HTTP servers. If you want to generalize this to deploy anything on Banana, [see the guide here](https://www.notion.so/banana-dev/How-To-Serve-Anything-On-Banana-125a65fc4d30496ba1408de1d64d052a). Look at test.py for instructions on how to call this model both locally and deployed on Banana.

Move to prod:

At this point, you have a functioning HTTP server for your ML model. You can use it as is, or package it up with our provided Dockerfile and deploy it to your favorite container hosting provider!

If Banana is your favorite GPU hosting provider, read on!

🍌

Deploy to Banana Serverless:

Three steps:

  1. Create your own copy of this template repo. Either:
  • Click "Fork" (creates a public repo)
  • Click "Use this Template" (creates a private or public repo)
  • Create your own repo and copy the template files into it
  2. Log in to the Banana Dashboard and set up your account by saving your payment details and linking your GitHub.
  3. Push your code to your repo's default branch.

From then onward, any pushes to the default repo branch (usually "main" or "master") trigger Banana to build and deploy your server, using the Dockerfile. Throughout the build we'll sprinkle in some secret sauce to make your server extra snappy 🔥

It'll then be deployed on our Serverless GPU cluster and callable with any of our serverside SDKs.

You can monitor buildtime and runtime logs by clicking the logs button in the model view on the [Banana Dashboard](https://app.banana.dev).


Use Banana for scale.