niieani/gpt-tokenizer

Error: Model name must be provided either during initialization or passed in to the method


I'm using code from the instructions:

import { encodeChat } from 'gpt-tokenizer';
...
// Example chat:
const chat = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'assistant', content: 'gpt-tokenizer is awesome.' },
];

// Encode chat into tokens
const chatTokens = encodeChat(chat);

However, this throws the following error:

gpt-tokenizer.js?v=5e943746:656 Uncaught (in promise) Error: Model name must be provided either during initialization or passed in to the method.
    at _GptEncoding.encodeChatGenerator (gpt-tokenizer.js?v=5e943746:656:13)
    at encodeChatGenerator.next (<anonymous>)
    at _GptEncoding.encodeChat (gpt-tokenizer.js?v=5e943746:698:21)

According to the instructions, "By default, importing from gpt-tokenizer uses cl100k_base encoding, used by gpt-3.5-turbo and gpt-4", so I'm not sure why I'm getting this error.

This fixed it: encodeChat(chat, 'gpt-3.5-turbo')
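
For anyone else hitting this, here is a minimal sketch of the workaround using the chat array from the reproduction above. Passing the model name to encodeChat is what the line above confirms works; the commented-out model-specific import path is an assumption based on the README's model entry points and may differ in your version.

import { encodeChat } from 'gpt-tokenizer';

const chat = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'assistant', content: 'gpt-tokenizer is awesome.' },
];

// Passing the model name explicitly avoids the error
const chatTokens = encodeChat(chat, 'gpt-3.5-turbo');

// Alternative (assumption: the README's model-specific entry point
// binds the model up front, so no second argument is needed):
// import { encodeChat } from 'gpt-tokenizer/model/gpt-3.5-turbo';
// const chatTokens = encodeChat(chat);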

isWithinTokenLimit may be broken as well

running

import { isWithinTokenLimit } from 'gpt-tokenizer';

isWithinTokenLimit(lastMessage.message, 1000);

throws "Model name must be provided either during initialization or passed in to the method"

for

"gpt-tokenizer": "^2.1.2",

cc @thdoan