[BUG] chat.ask returns output extremely slowly
UluuGashim048 opened this issue · 5 comments
I am sending only one request, and chat.ask returns the output only after 2-3 minutes. What might be the issue?
Thank you!
Not a bug, or an issue. Speed is related to load on their backend servers. Your internet. Etc.
But it works quickly with a different PyChatGPT package using the same credentials, @rawandahmad698
@UluuGashim048 does the other package spit out the entire result all at once, or does it stream the result word by word?
It's not handled as a text stream in the code, so it's all at once. Speed is very good on my end, so it must be his internet or region.
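To illustrate the distinction being drawn here: a client that buffers the whole response shows nothing until the last token arrives, while a streaming client shows the first chunk almost immediately, even if the total generation time is identical. A minimal sketch of that effect (the function names and the simulated token delays below are hypothetical, not part of this library):

```python
import time

def generate_tokens(n=5, delay=0.01):
    # Simulate a backend producing one token at a time, each after some latency.
    for i in range(n):
        time.sleep(delay)
        yield f"token{i} "

def buffered_ask():
    # Buffered style (like this library): join the full response, then return it.
    # The caller sees nothing until every token has arrived.
    start = time.monotonic()
    text = "".join(generate_tokens())
    time_to_first_output = time.monotonic() - start
    return text, time_to_first_output

def streamed_ask():
    # Streaming style: the first chunk is available after a single token's latency.
    start = time.monotonic()
    first_chunk_at = None
    parts = []
    for chunk in generate_tokens():
        if first_chunk_at is None:
            first_chunk_at = time.monotonic() - start
        parts.append(chunk)
    return "".join(parts), first_chunk_at
```

Both approaches return the same text, but the streamed version's time-to-first-output is roughly one token's latency instead of the full response time, which can make a non-streaming client feel much slower than the browser UI for the same query.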
@rawandahmad698 Fair enough. I hadn't actually checked whether the response was streamed or not and kind of assumed. I did previously notice slower results via the library while getting much faster results in my browser at the same time for the same query. However, this could just be random chance... unless it is somehow also related to the shadowban... (I think there should be a better term for that, perhaps shadow-limited.)