nomic-ai/pygpt4all

GPT4ALL J not loading, read length error

beingPurple opened this issue · 1 comment

I tried to load the new GPT4ALL-J model using pyllamacpp, but it refused to load. Running pyllamacpp-convert-gpt4all produces the following error:

  File "C:\Users...\pyllamacpp\scripts\convert.py", line 78, in read_tokens
    f_in.read(length)
ValueError: read length must be non-negative or -1

Hi @beingPurple,

The llama.cpp project does not yet support GPT-J models, so pyllamacpp cannot load or convert the GPT4ALL-J model; its converter misreads the file's header and ends up with an invalid (negative) token length, which is the error you are seeing.
You can try pygpt4all instead, which supports GPT4ALL-J models.
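For reference, here is a minimal sketch of loading a GPT4ALL-J model through pygpt4all. The `GPT4All_J` class, the `generate()` arguments, and the model path are assumptions based on the package's published examples and may differ in your installed version:

```python
# Minimal sketch (assumption: the GPT4All_J class and its generate()
# signature follow pygpt4all's published examples; the model path below
# is a placeholder for wherever your ggml GPT4ALL-J file lives).
from pygpt4all import GPT4All_J

def print_token(token):
    # Stream each generated token to stdout as it arrives.
    print(token, end="", flush=True)

# Load the GPT4ALL-J ggml weights directly -- no pyllamacpp conversion step.
model = GPT4All_J("./models/ggml-gpt4all-j-v1.3-groovy.bin")

# Generate a short completion, streaming tokens through the callback.
model.generate(
    "Once upon a time, ",
    n_predict=55,
    new_text_callback=print_token,
)
```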