serverless-inference
There are 3 repositories under the serverless-inference topic.
ServerlessLLM/ServerlessLLM
Serverless LLM Serving for Everyone.
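A minimal client sketch for a serving system of this kind, assuming it exposes an OpenAI-compatible chat-completions endpoint; the host, port, endpoint path, and model name below are illustrative assumptions, not taken from the repository:

    import requests

    # Hypothetical local endpoint and model name; adjust to your deployment.
    URL = "http://127.0.0.1:8343/v1/chat/completions"
    payload = {
        "model": "facebook/opt-1.3b",
        "messages": [{"role": "user", "content": "What is serverless inference?"}],
    }

    resp = requests.post(URL, json=payload, timeout=120)
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])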
Picovoice/serverless-picollm
LLM Inference on AWS Lambda
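A sketch of how LLM inference can run inside an AWS Lambda handler, assuming the picoLLM Python SDK's create/generate calls and an API Gateway-style event carrying a JSON body; the model path, access-key handling, and event shape are assumptions for illustration, not the repository's exact setup:

    import json
    import os

    import picollm

    # Load the model once per container so warm invocations reuse it.
    _pllm = picollm.create(
        access_key=os.environ["PICOVOICE_ACCESS_KEY"],
        model_path="/opt/model/model.pllm",  # hypothetical bundled model path
    )

    def handler(event, context):
        # Pull the prompt out of an API Gateway-style JSON body.
        prompt = json.loads(event.get("body") or "{}").get("prompt", "")
        res = _pllm.generate(prompt)
        return {
            "statusCode": 200,
            "body": json.dumps({"completion": res.completion}),
        }

Loading the model at module scope (outside the handler) is the usual Lambda pattern for keeping the expensive initialization out of the per-request path.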
tensorchord/modelz-py
Python SDK and CLI for modelz.ai, which is a developer-first platform for prototyping and deploying machine learning models.
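The deploy-then-call workflow such a platform supports can be illustrated with a plain HTTP request to a deployed model endpoint; the URL and payload shape below are placeholders for illustration only and are not the modelz.ai SDK or API:

    import requests

    # Hypothetical deployment URL and input schema; not the modelz-py client.
    ENDPOINT = "https://example-deployment.invalid/inference"
    resp = requests.post(ENDPOINT, json={"input": "a photo of a cat"}, timeout=60)
    resp.raise_for_status()
    print(resp.json())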