runpod-workers/worker-vllm

OpenAI Error: Not returning full output

Closed this issue · 1 comment

Specify the `max_tokens` parameter in your input. If it is left unset or set too low, generation stops early and the response looks like it is missing the rest of the output.
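
As a minimal sketch of what that looks like against the worker's OpenAI-compatible route: the endpoint ID, RunPod API key, and model name below are placeholders, and `1024` is just an example value, not a recommended setting.

```python
# Sketch: calling a worker-vllm endpoint through its OpenAI-compatible route
# with max_tokens set explicitly so the completion is not cut short.
# <ENDPOINT_ID>, <RUNPOD_API_KEY>, and <MODEL_NAME> are placeholders.
from openai import OpenAI

client = OpenAI(
    api_key="<RUNPOD_API_KEY>",                                   # RunPod API key
    base_url="https://api.runpod.ai/v2/<ENDPOINT_ID>/openai/v1",  # OpenAI-compatible route
)

response = client.chat.completions.create(
    model="<MODEL_NAME>",  # model the worker was deployed with
    messages=[{"role": "user", "content": "Write a short story about a robot."}],
    max_tokens=1024,       # set this high enough for your expected output length
)

print(response.choices[0].message.content)
```

The same applies to raw `/run` / `/runsync` requests: include `max_tokens` in the `sampling_params` of the job input rather than relying on the default.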