myshell-ai/AIlice

Error when trying to list current directory content

Closed this issue · 3 comments

Hello,

My first try seems to generate an error:

ASSISTANT_AIlice:   !CALL<!|"coder-proxy","file_listing","./current_directory"|!>
ASSISTANT_file_listing:
SYSTEM_AIlice:  Agent file_listing returned:

ASSISTANT_AIlice:   !CALL<!|"coder-proxy","file_listing","./current_directory"|!>
ASSISTANT_file_listing:  SYSTEM_AIlice:  Error code: 400 - {'error': "'messages' array must only contain objects with a 'content' field that is not empty."}EXCEPTION: Error code: 400 - {'error': "'messages' array must only contain objects with a 'content' field that is not empty."}
Traceback (most recent call last):
  File "/home/benda/AIlice/AIlice/ailice/core/AInterpreter.py", line 131, in EvalEntries
    r = self.Eval(script)
        ^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AInterpreter.py", line 107, in Eval
    return self.CallWithTextArgs(nodeType, paras)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AInterpreter.py", line 86, in CallWithTextArgs
    return action['func'](**paras)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AProcessor.py", line 148, in EvalCall
    resp = f"Agent {agentName} returned: {self.subProcessors[agentName](msg)}"
                                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AProcessor.py", line 113, in __call__
    ret = self.llm.Generate(prompt, proc=partial(self.outputCB, "ASSISTANT_" + self.name), endchecker=self.interpreter.EndChecker, temperature = config.temperature)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/llm/AModelChatGPT.py", line 26, in Generate
    for chunk in self.client.chat.completions.create(model=self.modelName,
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/resources/chat/completions.py", line 663, in create
    return self._post(
           ^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_base_client.py", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_base_client.py", line 889, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': "'messages' array must only contain objects with a 'content' field that is not empty."}


ASSISTANT_AIlice:   To list the contents of the current directory, you can use the "ls" command in bash. Please note that this command requires the "coder-proxy" agent type to execute.
!CALL<!|"coder-proxy","file_listing","""
ls
"""|!>
ASSISTANT_file_listing:  SYSTEM_AIlice:  Error code: 400 - {'error': "'messages' array must only contain objects with a 'content' field that is not empty."}EXCEPTION: Error code: 400 - {'error': "'messages' array must only contain objects with a 'content' field that is not empty."}
Traceback (most recent call last):
  File "/home/benda/AIlice/AIlice/ailice/core/AInterpreter.py", line 131, in EvalEntries
    r = self.Eval(script)
        ^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AInterpreter.py", line 107, in Eval
    return self.CallWithTextArgs(nodeType, paras)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AInterpreter.py", line 86, in CallWithTextArgs
    return action['func'](**paras)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AProcessor.py", line 148, in EvalCall
    resp = f"Agent {agentName} returned: {self.subProcessors[agentName](msg)}"
                                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/AProcessor.py", line 113, in __call__
    ret = self.llm.Generate(prompt, proc=partial(self.outputCB, "ASSISTANT_" + self.name), endchecker=self.interpreter.EndChecker, temperature = config.temperature)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benda/AIlice/AIlice/ailice/core/llm/AModelChatGPT.py", line 26, in Generate
    for chunk in self.client.chat.completions.create(model=self.modelName,
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/resources/chat/completions.py", line 663, in create
    return self._post(
           ^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_base_client.py", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_base_client.py", line 889, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/openai/_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': "'messages' array must only contain objects with a 'content' field that is not empty."}

Launched with the command line: python3 AIliceWeb.py --modelID=lm-studio:AbLoGa/Chat-Mistral-7b-openorca-q4-gguf

LM Studio log:
LM-Studio.log

This happens when the LLM outputs nothing, so an empty message ends up in the conversation history. It is already fixed in the latest version. Thanks!
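The failure mode is that an OpenAI-compatible server (here LM Studio) rejects any request whose `messages` array contains an entry with empty `content`. The actual patch in AIlice may differ; the sketch below just illustrates the kind of guard that avoids the 400 error, with a hypothetical `sanitize_messages` helper:

```python
def sanitize_messages(messages):
    """Drop messages whose 'content' is empty or whitespace-only.
    Some OpenAI-compatible backends (e.g. LM Studio) reject such
    entries with a 400 error. Illustrative sketch, not AIlice's
    actual fix."""
    return [m for m in messages if str(m.get("content", "")).strip()]

history = [
    {"role": "system", "content": "You are AIlice."},
    {"role": "assistant", "content": ""},  # empty reply from a stalled LLM
    {"role": "user", "content": "list the current directory"},
]

clean = sanitize_messages(history)
# Only the non-empty system and user messages survive; passing `clean`
# instead of `history` to chat.completions.create avoids the 400.
```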

No idea whether this is any better...?

root@ailice:/home/benda/AIlice/AIlice/ailice# python3 AIliceWeb.py --modelID=lm-studio:AbLoGa/Chat-Mistral-7b-openorca-q4-gguf --share=true
config.json is located at /root/.config/ailice
In order to simplify installation and usage, we have set local execution as the default behavior, which means AI has complete control over the local environment. To prevent irreversible losses due to potential AI errors, you may consider one of the following two methods: the first one, run AIlice in a virtual machine; the second one, install Docker, use the provided Dockerfile to build an image and container, and modify the relevant configurations in config.json. For detailed instructions, please refer to the documentation.
storage  started.
browser  started.
arxiv  started.
google  started.
duckduckgo  started.
scripter  started.
computer  started.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
We now start the vector database. Note that this may include downloading the model weights, so it may take some time.
Vector database has been started. returned msg: vector database has been switched to a non-persistent version. tokenizer: bert-base-uncased, model: nomic-ai/nomic-embed-text-v1
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
Running on local URL:  http://127.0.0.1:7860
IMPORTANT: You are using gradio version 4.19.2, however version 4.29.0 is available, please upgrade.
--------
Running on public URL: https://44bece524680185594.gradio.live

This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from Terminal to deploy to Spaces (https://huggingface.co/spaces)

ASSISTANT_AIlice:   !CALL<!|"coder-proxy","file_listing","./current_directory"|!>
ASSISTANT_file_listing:
SYSTEM_AIlice:  Agent file_listing returned:

ASSISTANT_AIlice:   !CALL<!|"coder-proxy","file_listing","./current_directory"|!>
ASSISTANT_file_listing:
SYSTEM_AIlice:  Agent file_listing returned:

ASSISTANT_AIlice:   !CALL<!|"coder-proxy","file_listing","./current_directory"|!>
ASSISTANT_file_listing:
SYSTEM_AIlice:  Agent file_listing returned:

ASSISTANT_AIlice:   !CALL<!|"coder-proxy","file_listing","./current_directory"|!>
ASSISTANT_file_listing:
SYSTEM_AIlice:  Agent file_listing returned:

ASSISTANT_AIlice:   !CALL<!|"coder-proxy","file_listing","./current_directory"|!>
ASSISTANT_file_listing:  ^C

LM Studio log:
LM-Studio.log

Yes, this is the expected outcome after the fix.

7B models used to be able to complete some simple tasks, but as AIlice's design has become more complex, they are no longer adequate and often perform poorly. We therefore recommend switching to a stronger model. In the future, we will consider fine-tuning a 7B model specifically for AIlice's use case to see if it can achieve the desired results.

Additionally, a setup based on hf:Open-Orca/Mistral-7B-OpenOrca might perform slightly better, but it still remains impractical. Currently, achieving practical results requires GPT-4o; the best open-source model available (as far as I know) is Mixtral-8x22b-instruct, but it still cannot execute tasks as smoothly as GPT-4o.