API Related Issue
(llm-inference-server) (base) jawabreh@ahmad:~/Desktop/CyprusCodes/use-cas$ python main.py --prompt "return me all the data related to flight number RJ60659"
Admin (to chat_manager):
Provide the user with the details from their flight data that they want in provided PROMPT.
Figure out the specific flight details the user wants if needed.
PROMPT:
return me all the data related to flight number RJ60659
Traceback (most recent call last):
  File "/home/jawabreh/Desktop/CyprusCodes/use-cas/main.py", line 66, in <module>
    main(args.prompt)
  File "/home/jawabreh/Desktop/CyprusCodes/use-cas/main.py", line 51, in main
    user.initiate_chat(
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 621, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 398, in send
    recipient.receive(message, self, request_reply, silent)
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 551, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1193, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/agentchat/groupchat.py", line 372, in run_chat
    speaker = groupchat.select_speaker(speaker, self)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/agentchat/groupchat.py", line 214, in select_speaker
    final, name = selector.generate_oai_reply(context)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 708, in generate_oai_reply
    response = client.create(
               ^^^^^^^^^^^^^^
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/oai/client.py", line 278, in create
    response = self._completions_create(client, params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/autogen/oai/client.py", line 543, in _completions_create
    response = completions.create(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jawabreh/miniconda3/envs/llm-inference-server/lib/python3.11/site-packages/openai/_utils/_utils.py", line 271, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: Completions.create() got an unexpected keyword argument 'use_cache'
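The traceback shows autogen forwarding its request params straight into the openai v1 client's `create()` call, which no longer accepts a `use_cache` keyword, so a config still carrying that key fails. A minimal sketch of that failure mode and a dict-filtering workaround, using a stub `create()` in place of the real client (the stub's signature, the `UNSUPPORTED` set, and the `"local-llm"` model name are illustrative assumptions, not the actual openai API):

```python
# Stub standing in for openai v1's Completions.create, which takes only
# the keyword arguments it declares and rejects everything else. The real
# client behaves the same way for unknown kwargs.
def create(*, model, messages):
    return {"model": model, "messages": messages}

# The request params forwarded into create(); a leftover "use_cache" key
# is what triggers the TypeError in the log above.
params = {"model": "local-llm", "messages": [], "use_cache": True}

raised = False
msg = ""
try:
    create(**params)
except TypeError as err:
    raised = True
    msg = str(err)  # mentions the unexpected 'use_cache' keyword argument

# Workaround sketch: drop unsupported keys before the call. Newer pyautogen
# releases moved caching out of this path (a cache_seed config entry instead).
UNSUPPORTED = {"use_cache"}
clean = {k: v for k, v in params.items() if k not in UNSUPPORTED}
response = create(**clean)
```

In practice the same effect is achieved by removing `use_cache` from the `llm_config` dict passed to the agents, rather than patching the call site.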