niieani/gpt-tokenizer

JavaScript BPE tokenizer (encoder/decoder) for OpenAI's GPT-2 / GPT-3 / GPT-4. A port of OpenAI's tiktoken with additional features.

TypeScript · MIT
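Since the repository implements byte-pair encoding (BPE), a minimal self-contained sketch of the core idea may help orient readers: repeatedly merge the highest-priority adjacent pair until no merge rule applies. The merge table below is a toy example, not real GPT vocabulary data, and this is not the library's actual implementation.

```typescript
// Minimal sketch of the byte-pair-encoding (BPE) idea behind tokenizers
// like tiktoken: repeatedly merge the highest-priority adjacent pair.
// The merge ranks here are toy values, not real GPT data.
type Rank = number;

const mergeRanks = new Map<string, Rank>([
  ["l l", 0], // "ll" merges first (lowest rank = highest priority)
  ["h e", 1],
  ["he ll", 2],
  ["hell o", 3],
]);

function bpeEncode(word: string): string[] {
  // Start from individual characters (real tokenizers start from bytes).
  let parts = [...word];
  for (;;) {
    // Find the adjacent pair with the lowest rank.
    let best = -1;
    let bestRank = Infinity;
    for (let i = 0; i < parts.length - 1; i++) {
      const rank = mergeRanks.get(`${parts[i]} ${parts[i + 1]}`);
      if (rank !== undefined && rank < bestRank) {
        bestRank = rank;
        best = i;
      }
    }
    if (best === -1) break; // no more merge rules apply
    parts = [
      ...parts.slice(0, best),
      parts[best] + parts[best + 1],
      ...parts.slice(best + 2),
    ];
  }
  return parts;
}

console.log(bpeEncode("hello")); // merges l+l, h+e, he+ll, hell+o → ["hello"]
```

A real tokenizer then maps each merged part to a token ID via its vocabulary; the merge loop above is the part that differs between encodings such as cl100k_base and o200k_base.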
Issues
- Support New o200k_base (#42, opened by RedwindA, 0 comments)
- Dependency Dashboard (#14, opened by renovate, 2 comments)
- ChatMessage[] TypeScript warning (#36, opened by thdoan, 2 comments)
- Support function calls? (#21, opened by seoker, 4 comments)
- avoid expensive initialization (#18, opened by mimoo, 0 comments)
- Picture tokens (#41, opened by arthurwolf, 1 comment)
- the `isWithinTokenLimit` may be broken (#39, opened by zhouhan760503, 0 comments)
- vercel/pkg Error with gpt-tokenizer (#40, opened by Ranork, 0 comments)
- Error: Model name must be provided either during initialization or passed in to the method (#37, opened by thdoan, 0 comments)
- Huge memory consumption of isWithinTokenLimit (#35, opened by aminsol, 0 comments)
- How to set the model in encode / encodeGenerator, like encodeChat or encodeChatGenerator? (#34, opened by MiniSuperDev, 16 comments)
- encodeChatGenerator undefined? (#15, opened by congqi09, 1 comment)
- Error in NextJS edge function: Attempted import error: './index.js' does not contain a default export (imported as 'cjs') (#30, opened by gablabelle, 5 comments)
- Error: Invariant: Method expects to have requestAsyncStorage, none available (#27, opened by chenqilin70, 2 comments)
- This module is not ready for CJK characters (#16, opened by mashihua, 2 comments)
- Calculated tokens much higher than actual (#6, opened by Qarj, 2 comments)
- comparison to other tokenizers (#11, opened by transitive-bullshit, 1 comment)
- Loading with script tags doesn't work (#12, opened by baobabKoodaa, 3 comments)
- How can I get prompts_tokens? (#10, opened by magelikescoke, 1 comment)
- Which encoding is this using? (#5, opened by ricardomatias, 2 comments)
- CDN version is broken (#7, opened by alejandro5042, 4 comments)
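Several of the issues above (#18 on expensive initialization, #35 on memory consumption of `isWithinTokenLimit`) circle the same theme: checking a token limit should not require materializing every token. A minimal self-contained sketch of the early-exit pattern follows; the function names mirror the library's generator-based API, but the "one token per word" tokenization rule is a toy assumption, not the library's actual implementation.

```typescript
// Illustrative sketch (not the library's real code) of why a token-limit
// check can avoid encoding the entire text: if encoding is exposed as a
// generator that yields token chunks, the check can stop as soon as the
// limit is exceeded, touching only a prefix of the input.
function* encodeGeneratorSketch(text: string): Generator<number[]> {
  // Toy assumption: each whitespace-separated word is exactly one token.
  for (const word of text.split(/\s+/).filter(Boolean)) {
    yield [word.length]; // toy "token ID": just the word length
  }
}

function isWithinTokenLimitSketch(
  text: string,
  limit: number,
): number | false {
  let count = 0;
  for (const chunk of encodeGeneratorSketch(text)) {
    count += chunk.length;
    if (count > limit) return false; // early exit: rest is never encoded
  }
  return count; // under the limit: return the token count
}

console.log(isWithinTokenLimitSketch("one two three", 5)); // 3
console.log(isWithinTokenLimitSketch("one two three", 2)); // false
```

The "token count or `false`" return shape matches the library's documented `isWithinTokenLimit` signature; the early-exit loop only saves work when the underlying encoder is lazy, which is what the generator variants provide.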