DataBassGit/AgentForge

Better Error Handling for Encoder models

DataBassGit opened this issue · 1 comment

When too many requests are sent to an encoder model, the API can respond with a rate-limit error that currently aborts the run. We should add error handling for these situations so that the work can continue.

Traceback (most recent call last):
  File "C:\GitKraken\BigBoogaAGI\Utilities\chroma_utils.py", line 192, in save_status
    self.collection.update(
  File "C:\Users\josmith\Anaconda3\lib\site-packages\chromadb\api\models\Collection.py", line 271, in update
    embeddings = self._embedding_function(documents)
  File "C:\Users\josmith\Anaconda3\lib\site-packages\chromadb\utils\embedding_functions.py", line 39, in __call__
    for result in self._client.create(
  File "C:\Users\josmith\Anaconda3\lib\site-packages\openai\api_resources\embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "C:\Users\josmith\Anaconda3\lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "C:\Users\josmith\Anaconda3\lib\site-packages\openai\api_requestor.py", line 226, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "C:\Users\josmith\Anaconda3\lib\site-packages\openai\api_requestor.py", line 620, in _interpret_response
    self._interpret_response_line(
  File "C:\Users\josmith\Anaconda3\lib\site-packages\openai\api_requestor.py", line 683, in _interpret_response_line
    raise self.handle_error_response(
openai.error.RateLimitError: The server is currently overloaded with other requests. Sorry about that! You can retry your request, or contact us through our help center at help.openai.com if the error persists.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\GitKraken\BigBoogaAGI\salience.py", line 39, in <module>
    statusAgent.run_status_agent(data)
  File "C:\GitKraken\BigBoogaAGI\Agents\status_agent.py", line 49, in run_status_agent
    self.save_status(status, task_id, task_desc, task_order)
  File "C:\GitKraken\BigBoogaAGI\Agents\status_agent.py", line 119, in save_status
    self.storage.save_status(status, task_id, text, task_order)
  File "C:\GitKraken\BigBoogaAGI\Utilities\chroma_utils.py", line 203, in save_status
    raise ValueError(f"\n\nError saving status. Error: {e}")
ValueError:

Error saving status. Error: The server is currently overloaded with other requests. Sorry about that! You can retry your request, or contact us through our help center at help.openai.com if the error persists.
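
For reference, a minimal sketch of the kind of handling this could use: wrap the Chroma collection.update() call in a retry loop that catches openai.error.RateLimitError and backs off before retrying. The helper name and retry parameters below are illustrative only, not the actual change that was merged.

import time

import openai


def update_with_retry(collection, max_retries=5, base_delay=2.0, **update_kwargs):
    # Retry collection.update() when the OpenAI embedding backend is
    # overloaded, backing off exponentially (2s, 4s, 8s, ...) between attempts.
    for attempt in range(max_retries):
        try:
            return collection.update(**update_kwargs)
        except openai.error.RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries, surface the original error
            time.sleep(base_delay * (2 ** attempt))

save_status in chroma_utils.py could then call update_with_retry(self.collection, ids=..., documents=..., metadatas=...) instead of calling self.collection.update() directly, so a transient rate-limit error no longer kills the run.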

The most recent merge should solve this.