modal-labs/modal-examples

How do I set stop_token_id in vllm_inference.py?

Closed this issue · 1 comment

I would like to create a dataset using Magpie, a method for generating synthetic datasets, but I don't know how to set `stop_token_id` in vllm_inference.py. Could you share code showing how to do this?

Hello there!

We reserve the Issues section on this GitHub repo for problems with the examples themselves, rather than for support with using the examples. For that, you can ask in our community Slack.

But FYI, you can set stop token IDs in vLLM via the `stop_token_ids` field of `SamplingParams`.