Extending to also support OpenAI?
Closed this issue · 2 comments
NicolaiSchmid commented
You currently use the transformers library to access the models.
But apart from Falcon, most models on HF lack either a commercial licence or competitive performance.
Would you be open to also supporting calls into the OpenAI library (Azure and non-Azure) and utilizing the `logit_bias` parameter of the OpenAI API?
It's a feature that I need in my current workflow.
r2d4 commented
Happy to support the OpenAI models. The one problem is that the ChatGPT tokenizer vocabulary isn't public, so we don't know the full vocabulary. That's necessary because ReLLM/ParserLLM needs to filter from the full set of tokens. I'm not exactly sure what the answer here is yet, but open to suggestions.
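To illustrate why the full vocabulary matters: ReLLM-style constrained decoding works by scanning every token in the vocabulary and suppressing the ones that would violate the target pattern, which for the OpenAI API would translate into a `logit_bias` map of token IDs to -100. The sketch below uses a tiny hypothetical vocabulary and a digits-only constraint (the toy `VOCAB` and the helper name are illustrative, not part of either library); with a real OpenAI model the token IDs would come from a real tokenizer, which is exactly the missing piece discussed above.

```python
def build_logit_bias(vocab: dict[str, int]) -> dict[int, int]:
    """Return a logit_bias map that suppresses (bias -100) every token
    that is not purely numeric, leaving only digit tokens available.
    Requires iterating over the *full* vocabulary: any token we cannot
    enumerate here also cannot be biased away."""
    return {tok_id: -100 for tok, tok_id in vocab.items() if not tok.isdigit()}

# Hypothetical toy vocabulary; a real tokenizer would supply tens of
# thousands of token -> id entries.
VOCAB = {"a": 0, "b": 1, "ab": 2, "1": 3, "2": 4, "12": 5}

bias = build_logit_bias(VOCAB)
print(bias)  # non-digit tokens mapped to -100

# The resulting map would then be passed to the API, e.g.:
#   openai.ChatCompletion.create(..., logit_bias=bias)
# Note the OpenAI API also caps logit_bias at a limited number of
# entries per request, which is a further constraint on this approach.
```

Since `logit_bias` accepts a bounded number of entries, in practice one would bias toward a small allowed set rather than against the (unknown and large) disallowed set.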