niieani/gpt-tokenizer

the `isWithinTokenLimit` may be broken

Closed this issue · 1 comment


Running:

```ts
import { isWithinTokenLimit } from 'gpt-tokenizer';

isWithinTokenLimit(lastMessage.message, 1000);
```

throws `Model name must be provided either during initialization or passed in to the method`

for

"gpt-tokenizer": "^2.1.2",

cc @thdoan

Originally posted by @michaelfarrell76 in https://github.sheincorp.cn/niieani/gpt-tokenizer/issues/37#issuecomment-1778736225

I got the same issue. You need to specify the model when importing functions:

```ts
import { isWithinTokenLimit } from "gpt-tokenizer/model/gpt-3.5-turbo";
```