A simple deployment package for running a vLLM inference server on UbiOps.
Primary language: Python
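For illustration, here is a minimal sketch of what such a package's deployment.py could look like, combining the standard UbiOps Deployment class convention with vLLM's offline LLM API. The model name, the MODEL_NAME environment variable, and the prompt/response field names are assumptions for this sketch, not necessarily what this repository uses.

```python
import os

from vllm import LLM, SamplingParams


class Deployment:
    """UbiOps deployment that wraps a vLLM engine for text generation."""

    def __init__(self, base_directory, context):
        # Load the model once when the deployment instance starts.
        # MODEL_NAME is a hypothetical environment variable; the actual
        # package may hard-code a model or read it from another source.
        model_name = os.environ.get("MODEL_NAME", "facebook/opt-125m")
        self.llm = LLM(model=model_name)

    def request(self, data):
        # 'prompt' and 'response' are assumed input/output field names
        # of the UbiOps deployment; adjust to match the actual schema.
        sampling_params = SamplingParams(temperature=0.7, max_tokens=256)
        outputs = self.llm.generate([data["prompt"]], sampling_params)
        return {"response": outputs[0].outputs[0].text}
```

Loading the model in `__init__` keeps the expensive weight loading out of the per-request path, which is the usual pattern for UbiOps deployments that serve large models.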