run-llama/sec-insights

Error: Unable To Load Chat Response - openai.BadRequestError

CamDuffy1 opened this issue · 0 comments

I've noticed an occasional error where the front-end conversation interface fails to load a chat response. I am running the application locally from a Codespace on the latest version of the main branch. The error appears reproducible and seems to depend on the specific wording of the prompt:

[Screenshot: 2024-03-01 - openai BadRequestError - GUI - 02]

When this happens, I see a corresponding openai.BadRequestError message on the backend server.

[Screenshot: 2024-03-01 - openai BadRequestError - CLI]
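A `BadRequestError` from the OpenAI API often indicates a malformed request payload rather than a transient failure, which would explain why it tracks the wording of the prompt (e.g. a message ending up with empty content after preprocessing). As a diagnostic aid, here is a minimal, hypothetical pre-flight validator sketch — the function name and checks are my own, not part of sec-insights — that rejects message shapes commonly refused by the chat-completions endpoint before the request is sent:

```python
VALID_ROLES = {"system", "user", "assistant"}


def validate_messages(messages: list[dict]) -> bool:
    """Hypothetical pre-flight check for a chat-completions payload.

    Raises ValueError for shapes that commonly trigger
    openai.BadRequestError, so the bad payload can be logged
    on the backend before the API call is made.
    """
    if not messages:
        raise ValueError("messages list is empty")
    for i, msg in enumerate(messages):
        role = msg.get("role")
        if role not in VALID_ROLES:
            raise ValueError(f"message {i} has invalid role: {role!r}")
        content = msg.get("content")
        if not isinstance(content, str) or not content.strip():
            raise ValueError(f"message {i} has empty or non-string content")
    return True
```

Logging the rejected payload alongside the prompt that produced it would help narrow down which wording triggers the 400 response.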