Groq llama3 model works with errors
Basically, I'm experiencing a similar issue to the one described in #84 (in particular #84 (comment)). I'm using v3.22 downloaded from Google Play.
After every response I get an error message. The response from the API is fine (see the minimal check below the logs), so the model itself works.
Screenshots attached: Issue, Same chat (next message), API url, Model, Event log (not sure if it's relevant).
Full chat log: c23975d31601afd39aae3feb0d1e95498b77dafaff4a0b1213e2635bbf67ef94.json
The crash log is empty. The ads log is not relevant here, I think.
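For context, a minimal standalone check against Groq's OpenAI-compatible endpoint looks roughly like this (the endpoint URL and model ID follow Groq's public docs; the snippet is only an illustrative sketch, not code from SpeakGPT or from my setup):

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Minimal check that the Groq OpenAI-compatible endpoint answers on its own,
// outside of SpeakGPT. GROQ_API_KEY is assumed to be set in the environment.
fun main() {
    val apiKey = System.getenv("GROQ_API_KEY") ?: error("GROQ_API_KEY not set")
    val body = """
        {
          "model": "llama3-70b-8192",
          "messages": [{"role": "user", "content": "Say hi in one word."}]
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.groq.com/openai/v1/chat/completions"))
        .header("Authorization", "Bearer $apiKey")
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    // A 200 with a normal chat-completion JSON means the endpoint itself is fine.
    println("${response.statusCode()}\n${response.body()}")
}
```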
It looks like the SpeakGPT app adds this text itself, so it's not an issue with the configuration or the API; it's an issue with the error-handling logic inside the app. It fires when it shouldn't, I think.
As you can see, this message is appended only when an exception occurs. It means that the library used in SpeakGPT is not fully compatible with other endpoints. The only thing I can do is add an option so users can hide error messages.
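As a rough illustration of that workaround (class and setting names here are hypothetical, not SpeakGPT's actual code), the error bubble can simply be gated behind a user setting while the real reply is always shown:

```kotlin
// Hypothetical sketch of the idea behind the fix: the error bubble is only
// appended when an exception surfaces from the client library, so a
// user-facing setting can gate that append.
class ChatErrorPresenter(private val showApiErrors: Boolean) {

    private val messages = mutableListOf<String>()

    fun deliver(assistantReply: String?, error: Throwable?) {
        if (assistantReply != null) {
            messages.add(assistantReply)            // the real reply is always shown
        }
        if (error != null && showApiErrors) {
            messages.add("Error: ${error.message}") // suppressed when the toggle is off
        }
    }

    fun transcript(): List<String> = messages
}
```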
> The only thing I can do is add an option so users can hide error messages.
I think this is an acceptable solution for now. After all, the model works just fine; it's just the UI that is annoying at the moment.
> the library used in SpeakGPT is not fully compatible with other endpoints
Should I open an issue against that library? If yes, please point me in the right direction.
Users don't see such errors when using the official OpenAI endpoint.
Starting from SpeakGPT 3.23 you can disable error messages. It usually takes 1-2 business days before the Play Store version becomes available.