Example Starting Code for LLM Serving with Aviary
This example only works on Anyscale.
To run it in an Anyscale workspace, use the following steps:
- Clone this repo into the workspace:
`git clone https://github.com/anyscale/example-llmserving-aviary .`
- Change into the backend deployment directory:
`cd Aviary_Backend_Deployment`
- Run the sample code:
`serve run aviary.backend:llm_application models="models/"`
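Once `serve run` is up, the application can be queried over HTTP. The sketch below only builds and prints a request body; the model id, base URL (Ray Serve's default port 8000), route, and payload shape are all assumptions, not confirmed by this repo, so check the Aviary backend's own documentation before sending real requests.

```python
import json

# Assumptions: these values are illustrative, not taken from the repo.
MODEL_ID = "example-model"          # depends on what is in models/
BASE_URL = "http://localhost:8000"  # Ray Serve's default HTTP port

def build_query(prompt: str) -> dict:
    """Build a JSON request body (the expected shape is an assumption)."""
    return {"prompt": prompt}

body = build_query("What is Ray Serve?")
print(json.dumps(body))
# With the backend running in the workspace, a request might look like:
#   curl -X POST "$BASE_URL/$MODEL_ID/generate" -H "Content-Type: application/json" -d '<json body>'
```

This keeps the query construction separate from transport, so the same body can be sent with `curl`, `requests`, or any other HTTP client.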
The backend can also be deployed to an Anyscale Service. Use the following steps:
- Write a YAML service definition file.
- Deploy the service: `anyscale service rollout -f <yaml_definition.yaml>`
- Test the service by sending a request to the deployed endpoint.
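A minimal service definition might look like the sketch below. The exact schema varies across Anyscale versions, so treat every field name here as an assumption and verify it against the Anyscale services documentation; only the `entrypoint` command is taken from this repo's run instructions.

```yaml
# Hypothetical service definition -- field names are assumptions,
# not confirmed by this repo; check the Anyscale docs for your version.
name: aviary-llm-service
entrypoint: serve run aviary.backend:llm_application models="models/"
```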
Additional information is available in the onboarding documentation.