niieani/gpt-tokenizer

Support function calls?

Opened this issue · 3 comments

seoker commented

As you may know, function calling is now supported by OpenAI, and the tokens consumed by the function definitions are counted as part of the prompt.
With some googling, I found the calculation here.

It would be great if the library could also calculate the required tokens when function calls are used. 🙏🏼
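In the meantime, a rough workaround sketch, assuming only the existing `encode` export; encoding the raw JSON of the definitions only approximates the count, since OpenAI appears to serialize functions into a more compact format before tokenizing:

```ts
import { encode } from 'gpt-tokenizer'

// NOTE: illustrative only. Encoding the raw JSON of the function
// definitions gives a ballpark figure, not the exact count OpenAI bills.
const functions = [
  {
    name: 'get_weather',
    description: 'Get the current weather for a location',
    parameters: {
      type: 'object',
      properties: { location: { type: 'string' } },
      required: ['location'],
    },
  },
]

const approxFunctionTokens = encode(JSON.stringify(functions)).length
console.log(approxFunctionTokens)
```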

One related issue is the type signature of ChatMessage. Depending on whether function calling is used, a ChatMessage may have either the content field or the function_call field, but not both. The current typing in gpt-tokenizer/src/GptEncoding.ts will need an update.
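Something like a discriminated union could express that constraint. This is just a sketch of the shape I have in mind (field names follow the OpenAI Chat Completions API), not the library's current types:

```ts
interface BaseChatMessage {
  role: 'system' | 'user' | 'assistant' | 'function'
  name?: string
}

// A regular message carries `content` and never `function_call`.
interface ContentChatMessage extends BaseChatMessage {
  content: string
  function_call?: never
}

// An assistant message that calls a function carries `function_call`
// and has no textual content.
interface FunctionCallChatMessage extends BaseChatMessage {
  content?: null
  function_call: { name: string; arguments: string }
}

type ChatMessage = ContentChatMessage | FunctionCallChatMessage
```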

I saw that https://github.com/hmarr/openai-chat-tokens already does it - would it be fine to import that package here to provide that functionality?

PRs welcome! I would prefer not to pull in an external dependency, and it looks like the code isn't too complex.
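For reference, the approach openai-chat-tokens takes (as far as I can tell; treat the details below as assumptions) is to render the function definitions into a TypeScript-like namespace block, tokenize that string, and add a small fixed overhead. A minimal sketch of that idea, written from scratch rather than copied from the package:

```ts
import { encode } from 'gpt-tokenizer'

interface FunctionDef {
  name: string
  description?: string
  parameters: {
    type: 'object'
    properties: Record<string, { type: string; description?: string; enum?: string[] }>
    required?: string[]
  }
}

// Render the definitions roughly the way the model appears to see them:
// a TypeScript-like namespace of function signatures. The exact format and
// the fixed overhead below are empirical guesses, not published by OpenAI.
function formatFunctionDefinitions(functions: FunctionDef[]): string {
  const lines: string[] = ['namespace functions {', '']
  for (const fn of functions) {
    if (fn.description) lines.push(`// ${fn.description}`)
    const params = Object.entries(fn.parameters.properties)
      .map(([key, prop]) => {
        const optional = fn.parameters.required?.includes(key) ? '' : '?'
        const type = prop.enum
          ? prop.enum.map((value) => JSON.stringify(value)).join(' | ')
          : prop.type
        return `${key}${optional}: ${type},`
      })
      .join('\n')
    lines.push(`type ${fn.name} = (_: {`, params, `}) => any;`, '')
  }
  lines.push('} // namespace functions')
  return lines.join('\n')
}

// Rough constant for the wrapper tokens around the definitions; would need
// to be verified empirically against the API's reported prompt_tokens.
const FUNCTION_DEFINITION_OVERHEAD = 9

export function estimateFunctionTokens(functions: FunctionDef[]): number {
  return encode(formatFunctionDefinitions(functions)).length + FUNCTION_DEFINITION_OVERHEAD
}
```

The exact serialization and the overhead constant would need to be validated against the prompt_tokens reported by the API for a few real requests before merging anything.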