
Common recipes to run vLLM


This repo hosts community-maintained recipes for running vLLM, answering the question: how do I run model X on hardware Y for task Z?
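
As a rough illustration of what a recipe answers, serving a model usually comes down to a single command plus the flags that matter for the target hardware. The model name and flag below are only an example, not taken from any particular guide:

vllm serve Qwen/Qwen3-8B --tensor-parallel-size 1   # example: serve a model on one GPU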

Guides

- DeepSeek
- Ernie
- GLM
- InternLM
- Llama
- OpenAI
- Qwen
- Seed

Contributing

Please feel free to contribute by adding a new recipe or improving an existing one; just send us a PR!

While the repo is designed to be directly viewable on GitHub (Markdown files are first-class citizens), you can also build the docs as web pages locally:

uv venv                               # create a virtual environment
source .venv/bin/activate             # activate it
uv pip install -r requirements.txt    # install the docs dependencies
uv run mkdocs serve                   # build and serve the docs locally (by default at http://127.0.0.1:8000)

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.