The OpenAI translation interface returns a 500 Internal Server Error.
Closed this issue · 2 comments
jbw772713376 commented
What happened?
When calling the OpenAI translation interface, the request fails with a 500 Internal Server Error. However, Raycast AI chat works normally with the same models.
What did you expect to happen?
The Raycast translator should respond successfully.
raycast_api_proxy version
docker image ls
REPOSITORY TAG IMAGE ID CREATED SIZE
ghcr.io/yufeikang/raycast_api_proxy main ac78dc5f3797 7 days ago 200MB
raycast_api_proxy env
"CERT_FILE=/data/cert/backend.raycast.com.cert.pem",
"CERT_KEY=/data/cert/backend.raycast.com.key.pem",
"LOG_LEVEL=INFO",
"OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"OPENAI_BASE_URL=https://api.luee.net/v1",
"PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
"LANG=C.UTF-8",
"GPG_KEY=A035C8C19219BA821ECEA86B64E628F8D684696D",
"PYTHON_VERSION=3.10.14",
"PYTHON_PIP_VERSION=23.0.1",
"PYTHON_SETUPTOOLS_VERSION=65.5.1",
"PYTHON_GET_PIP_URL=https://github.com/pypa/get-pip/raw/dbf0c85f76fb6e1ab42aa672ffca6f0a675d9ee4/public/get-pip.py",
"PYTHON_GET_PIP_SHA256=dfe9fd5c28dc98b5ac17979a953ea550cec37ae1b47a5116007395bfacff2ab9",
"PYTHONPATH=/project/pkgs"
raycast_api_proxy log
2024-07-05 03:00:19,545 MainThread main.py :109 INFO : Received request to /api/v1/ai/models
2024-07-05 03:00:19,546 MainThread utils.py :95 INFO : Received request: GET https://backend.raycast.com/api/v1/ai/models
2024-07-05 03:00:19,553 MainThread main.py :72 INFO : Received request to /api/v1/me
2024-07-05 03:00:19,554 MainThread utils.py :95 INFO : Received request: GET https://backend.raycast.com/api/v1/me
INFO: 183.134.211.52:33371 - "GET /api/v1/ai/models HTTP/1.1" 200 OK
INFO: 183.134.211.52:29061 - "GET /api/v1/me HTTP/1.1" 200 OK
INFO: 183.134.211.52:24589 - "POST /api/v1/translations HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/project/pkgs/anyio/streams/memory.py", line 94, in receive
return self.receive_nowait()
File "/project/pkgs/anyio/streams/memory.py", line 89, in receive_nowait
raise WouldBlock
anyio.WouldBlock
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/project/pkgs/starlette/middleware/base.py", line 78, in call_next
message = await recv_stream.receive()
File "/project/pkgs/anyio/streams/memory.py", line 114, in receive
raise EndOfStream
anyio.EndOfStream
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/project/pkgs/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/project/pkgs/uvicorn/middleware/proxy_headers.py", line 70, in __call__
return await self.app(scope, receive, send)
File "/project/pkgs/fastapi/applications.py", line 276, in __call__
await super().__call__(scope, receive, send)
File "/project/pkgs/starlette/applications.py", line 122, in __call__
await self.middleware_stack(scope, receive, send)
File "/project/pkgs/starlette/middleware/errors.py", line 184, in __call__
raise exc
File "/project/pkgs/starlette/middleware/errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "/project/pkgs/starlette/middleware/base.py", line 108, in __call__
response = await self.dispatch_func(request, call_next)
File "/project/app/middleware.py", line 55, in dispatch
response = await call_next(request)
File "/project/pkgs/starlette/middleware/base.py", line 84, in call_next
raise app_exc
File "/project/pkgs/starlette/middleware/base.py", line 70, in coro
await self.app(scope, receive_or_disconnect, send_no_error)
File "/project/pkgs/starlette/middleware/exceptions.py", line 79, in __call__
raise exc
File "/project/pkgs/starlette/middleware/exceptions.py", line 68, in __call__
await self.app(scope, receive, sender)
File "/project/pkgs/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/project/pkgs/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/project/pkgs/starlette/routing.py", line 718, in __call__
await route.handle(scope, receive, send)
File "/project/pkgs/starlette/routing.py", line 276, in handle
await self.app(scope, receive, send)
File "/project/pkgs/starlette/routing.py", line 66, in app
response = await func(request)
File "/project/pkgs/fastapi/routing.py", line 237, in app
raw_response = await run_endpoint_function(
File "/project/pkgs/fastapi/routing.py", line 163, in run_endpoint_function
return await dependant.call(**values)
File "/project/app/main.py", line 59, in proxy_translations
async for content in get_bot(model_name).translate_completions(
File "/project/app/models.py", line 244, in translate_completions
async for choice, error in self.__chat(messages, model=model, temperature=0.8):
File "/project/app/models.py", line 280, in __chat
yield chunk.choices[0], None
IndexError: list index out of range
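The traceback ends at `yield chunk.choices[0], None` in `app/models.py`, so the proxy is indexing into a streaming chunk whose `choices` list is empty. Some OpenAI-compatible backends (including third-party gateways like the `OPENAI_BASE_URL` used here) can emit stream chunks with no choices, e.g. a final usage-only chunk. A minimal sketch of a defensive guard, using a stand-in `Chunk` dataclass rather than the real OpenAI client types:

```python
# Sketch of a guard for the pattern in app/models.py __chat.
# `Chunk` and `safe_choices` are illustrative names, not part of
# raycast_api_proxy or the openai library.
from dataclasses import dataclass, field


@dataclass
class Chunk:
    # In the real client this would be a list of ChatCompletionChunk choices.
    choices: list = field(default_factory=list)


def safe_choices(chunks):
    """Yield the first choice of each chunk, skipping chunks with none.

    Blindly indexing chunk.choices[0] raises IndexError on empty chunks,
    which is exactly the 500 seen in the log above.
    """
    for chunk in chunks:
        if not chunk.choices:  # e.g. usage-only or keep-alive chunk
            continue
        yield chunk.choices[0]


# Example: the empty chunk in the middle is skipped instead of crashing.
stream = [Chunk(["hello"]), Chunk([]), Chunk(["world"])]
print(list(safe_choices(stream)))  # -> ['hello', 'world']
```

With such a check, a proxy would degrade gracefully on empty chunks instead of turning them into a 500 for the client.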
openai proxy
yufeikang commented
I couldn't determine the cause from the current logs, so I've added more logging details. Could you please test it again? Thanks! 😊
jbw772713376 commented
The issue was resolved after I updated to the image with ID db664d8f03f9.
Thx!