WongSaang/chatgpt-ui

response speed of chatgpt ui

Closed this issue · 21 comments

from @cgofun

The chat response speed is very slow and real-time performance is poor: it takes tens of seconds for the AI to reply. A direct speed test of the API request is not slow. What could be the reason for this?

I have the same observation. The speed at which chatgpt-ui generates text is a lot slower than other similar projects on GitHub.

I would also like to point out that if the generated text is very long, the text tends to drop out, causing a connection-closed error.

srs adds a large amount of memory overhead; check whether your machine has run out of memory.

Hello

I have more than enough resources running docker.

[screenshots: Docker resource usage]

cgofun commented

I am also experiencing slow AI response times, and not due to resource issues. The UI loads quickly, but the AI response time is very slow, and not just a little: it can take tens of seconds or even longer, which makes the project hard to use.

Try killing the `node server/index.mjs` process; we have found that this process may have a memory leak.

> Try killing the `node server/index.mjs` process; we have found that this process may have a memory leak.

[screenshot: terminal output of the executed commands]

Do you mean like this? Did I execute the commands correctly? According to ChatGPT, I did =D

I executed the commands in the chatgpt-ui-client container.

Kill this; the process will restart automatically. See if that solves the problem.
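For anyone following along, the kill step above can be done with `pgrep`/`pkill`. The container name `chatgpt-ui-client` is an assumption based on this thread, not something the maintainer specified:

```shell
# In a dockerized setup (container name assumed), the suggestion would be:
#   docker exec chatgpt-ui-client pkill -f "server/index.mjs"
# after which the process should be restarted automatically.

# The same find-and-kill pattern, demonstrated with a dummy process:
sleep 97 &                        # stand-in for "node server/index.mjs"
target=$!
pgrep -f "sleep 97" > /dev/null && echo "process found"
kill "$target"                    # pkill -f "sleep 97" would also work
```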

I asked chatGPT and got this result. =D

[screenshot: ChatGPT's answer]

cgofun commented

> Kill this; the process will restart automatically. See if that solves the problem.

I have tried killing that process; nothing changed. It was this slow right after a fresh install. I suspect the problem is in the wsgi container, which oddly lacks even the common commands. Once a conversation starts, it is the wsgi container that sends the request to the OpenAI API, right? That request's response looks a bit slow.

This issue may be caused by Node's proxy; previously, nginx was used to proxy backend requests.
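For anyone running their own reverse proxy in front of the backend: when nginx sits in front of a streaming (SSE-style) endpoint, response buffering holds chunks back until a buffer fills, which looks exactly like "the whole reply appears after tens of seconds". A minimal sketch of the relevant directives, assuming a hypothetical `/api/` location and upstream name (not this project's actual config):

```nginx
location /api/ {
    proxy_pass http://backend:8000;   # assumed upstream
    proxy_http_version 1.1;
    proxy_set_header Connection "";
    proxy_buffering off;              # forward chunks to the client immediately
    proxy_cache off;
    proxy_read_timeout 3600s;         # allow long-lived streaming responses
}
```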

🤪 Please help test!
Temporarily switch the client image to wongsaang/chatgpt-ui-client:test to see if it solves this issue.

> 🤪 Please help test! Temporarily switch the client image to wongsaang/chatgpt-ui-client:test to see if it solves this issue.

hello. the client is still slow

cgofun commented

Around 8 a.m., just before I sent this message, I tested the conversation and the response time was within a few seconds, which seemed normal. This suggests the problem may be related to busy periods on OpenAI's official API. My server is used only by me, yet it still feels slower than other clients I have deployed. I think this may be related to how the AI's responses are displayed: they appear all at once after a wait, instead of streaming in dynamically like a typewriter.
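The "typewriter" point above matters for perceived speed: with streaming, the user sees the first token almost immediately, even though the total generation time is the same. A small self-contained simulation of the difference (illustrative only, not this project's code):

```python
import time

def generate_tokens(n, delay=0.01):
    """Simulated model producing one token every `delay` seconds."""
    for i in range(n):
        time.sleep(delay)
        yield f"tok{i} "

def buffered_reply(n):
    """Wait for the whole reply before showing anything (what the thread describes).
    Returns how long the user stared at an empty screen."""
    start = time.monotonic()
    "".join(generate_tokens(n))
    return time.monotonic() - start

def streamed_reply(n):
    """Show each token as it arrives (the 'typewriter' behaviour).
    Returns the time to first token."""
    start = time.monotonic()
    first = None
    for _tok in generate_tokens(n):
        if first is None:
            first = time.monotonic() - start
    return first

total = buffered_reply(50)
first = streamed_reply(50)
print(f"buffered: user waits {total:.2f}s; streamed: first token after {first:.2f}s")
```

Same total work in both cases; only the time until the user sees *something* changes.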

cgofun commented
ChatGPT.UI.mp4

VS

Chatbot.UI.mp4


If I understand correctly, Chatbot UI is pure front-end, requesting OpenAI's API directly. To support multiple users, however, this project adds a server-side layer, and that extra hop means it may be a bit slower than Chatbot UI.
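The extra server layer only has to add per-chunk overhead, not a whole-response wait, as long as it relays upstream chunks to the client as they arrive. A sketch of the two behaviours (an illustration of the idea, not this project's actual server code):

```python
def relay(upstream_chunks):
    """Forward each upstream chunk to the client as soon as it arrives.
    The extra layer adds only per-chunk latency."""
    for chunk in upstream_chunks:
        yield chunk

def buffering_relay(upstream_chunks):
    """Anti-pattern: the client sees nothing until the upstream is done."""
    return "".join(upstream_chunks)

upstream = ["Hel", "lo, ", "world"]
print(list(relay(iter(upstream))))     # chunks re-emitted one by one
print(buffering_relay(upstream))       # one string, only after everything arrived
```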

cgofun commented

The name of each conversation is "Untitled Conversation", why?

> The name of each conversation is "Untitled Conversation", why?

Do you have any better suggestions?

cgofun commented

> The name of each conversation is "Untitled Conversation", why?

> Do you have any better suggestions?

Actually, this project is already excellent, and the response speed is much better than before. Regarding the conversation title, I just wanted to suggest that automatically generating a summary title, like ChatGPT does, would be better. Additionally, it would be ideal if "frugal mode" could be enabled or disabled via a backend parameter.
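The auto-title suggestion above could be implemented by sending the first user message to the chat endpoint with a summarization instruction. A hedged sketch: `title_messages` is a hypothetical helper (not part of chatgpt-ui), and the commented-out API call shows where the payload would go:

```python
def title_messages(first_user_message: str) -> list[dict]:
    """Build a chat payload asking the model for a short conversation title.
    (Hypothetical helper -- not chatgpt-ui's actual implementation.)"""
    return [
        {"role": "system",
         "content": "Summarize the user's message as a conversation title "
                    "of at most five words. Reply with the title only."},
        {"role": "user", "content": first_user_message},
    ]

msgs = title_messages("How do I deploy this project with docker-compose?")
# The payload would then go to the chat completion endpoint, e.g. (openai < 1.0):
#   openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=msgs)
print(msgs[0]["role"], "/", msgs[1]["role"])
```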

+1 for disabling frugal mode by default :)

> +1 for disabling frugal mode by default :)

You can now turn it off in the admin panel:
https://wongsaang.github.io/chatgpt-ui/guide/configuration.html#frugal-mode-control
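For readers wondering what "frugal mode" trades off: the idea is to send less conversation history to the API to save tokens, at the cost of context. A sketch of the concept (an illustration only, not chatgpt-ui's actual implementation):

```python
def build_payload(history: list[dict], frugal: bool) -> list[dict]:
    """With frugal mode on, send only the newest message instead of the
    whole history -- fewer tokens, but the model loses earlier context.
    (Conceptual sketch, not chatgpt-ui's real logic.)"""
    return history[-1:] if frugal else history

history = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
    {"role": "user", "content": "Summarize our chat"},
]
print(len(build_payload(history, frugal=True)),
      len(build_payload(history, frugal=False)))
```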

> +1 for disabling frugal mode by default :)

> You can now turn it off in the admin panel: https://wongsaang.github.io/chatgpt-ui/guide/configuration.html#frugal-mode-control

Oh thanks !!!! :)