vllm_ubiops_deployment

A simple deployment package to run a vLLM inference server on UbiOps.
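As a rough illustration of what such a package could contain, the sketch below shows a hypothetical `deployment.py` wrapping vLLM in the standard UbiOps deployment interface (a `Deployment` class with `__init__` and `request` methods). The model name, the `prompt`/`output` field names, and the sampling settings are illustrative assumptions, not taken from this repository; the fallback stub only exists so the sketch stays importable on machines without vLLM installed.

```python
# Hypothetical sketch of a UbiOps deployment.py wrapping vLLM.
# Field names, model, and sampling parameters are assumptions for
# illustration; they are not taken from this repository.

class Deployment:
    def __init__(self, base_directory, context):
        # Load the model once at deployment start-up so that
        # subsequent requests reuse the same vLLM engine.
        try:
            from vllm import LLM, SamplingParams  # needs a GPU environment
            self._sampling = SamplingParams(temperature=0.7, max_tokens=256)
            self._llm = LLM(model="facebook/opt-125m")  # example model
        except ImportError:
            # Fallback stub so the sketch runs without vLLM installed.
            self._sampling = None
            self._llm = None

    def request(self, data):
        # UbiOps passes the request payload as a dict; we assume an
        # input field named "prompt" and return a field named "output".
        prompt = data["prompt"]
        if self._llm is None:
            return {"output": f"[vLLM unavailable] echo: {prompt}"}
        outputs = self._llm.generate([prompt], self._sampling)
        return {"output": outputs[0].outputs[0].text}
```

In a real UbiOps deployment, this file would be zipped together with a `requirements.txt` listing `vllm`, and the deployment's input and output fields would be declared to match the keys used in `request`.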

Primary language: Python
