myshell-ai/AIlice

502 Network Error

Closed this issue · 1 comment

(base) PS G:\AIlice> conda activate AIlice
(AIlice) PS G:\AIlice> ailice_web --modelID=lm-studio:TheBloke/Mistral-7B-OpenOrca-GGUF/mistral-7b-openorca.Q5_K_M.gguf --prompt="main" --contextWindowRatio=0.5
config.json is located at C:\Users\29099\AppData\Local\Steven Lu\ailice
In order to simplify installation and usage, we have set local execution as the default behavior, which means AI has complete control over the local environment. To prevent irreversible losses due to potential AI errors, you may consider one of the following two methods: the first one, run AIlice in a virtual machine; the second one, install Docker, use the provided Dockerfile to build an image and container, and modify the relevant configurations in config.json. For detailed instructions, please refer to the documentation.
storage  started.
browser  started.
arxiv  started.
google  started.
duckduckgo  started.
scripter  started.
files  started.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
We now start the vector database. Note that this may include downloading the model weights, so it may take some time.
Vector database has been started. returned msg: vector database has been switched to a non-persistent version. tokenizer: bert-base-uncased, model: nomic-ai/nomic-embed-text-v1
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
ASSISTANT_AIlice:  Exception in thread Thread-6:
Traceback (most recent call last):
  File "C:\Program Files\Python310\lib\threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "C:\Program Files\Python310\lib\threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "G:\AIlice\ailice\core\AProcessor.py", line 83, in __call__
    ret = self.llm.Generate(prompt, proc=partial(self.outputCB, "ASSISTANT_" + self.name), endchecker=self.interpreter.EndChecker, temperature = config.temperature)
  File "G:\AIlice\ailice\core\llm\AModelChatGPT.py", line 26, in Generate
    for chunk in self.client.chat.completions.create(model=self.modelName,
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\resources\chat\completions.py", line 663, in create
    return self._post(
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_base_client.py", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_base_client.py", line 889, in request
    return self._request(
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_base_client.py", line 965, in _request
    return self._retry_request(
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_base_client.py", line 1013, in _retry_request
    return self._request(
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_base_client.py", line 965, in _request
    return self._retry_request(
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_base_client.py", line 1013, in _retry_request
    return self._request(
  File "C:\Users\29099\AppData\Roaming\Python\Python310\site-packages\openai\_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502

This turned out to be a proxy problem; it was resolved after resetting the proxy.
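
For anyone hitting the same 502: the error is raised by the OpenAI-compatible client while talking to the local LM Studio server, and a system-wide proxy sitting in between can answer with 502 when it cannot reach the local endpoint. Below is a minimal sketch, assuming LM Studio's default server address http://localhost:1234/v1 and the model name taken from the `--modelID` above; the endpoint AIlice actually uses comes from its config.json, so adjust accordingly. It checks the proxy environment variables and excludes loopback traffic from proxying before creating the client.

```python
import os

from openai import OpenAI

# If a system proxy is configured, requests to the local LM Studio server
# may be routed through it and fail with 502.
for var in ("HTTP_PROXY", "HTTPS_PROXY", "http_proxy", "https_proxy"):
    if os.environ.get(var):
        print(f"{var} is set to {os.environ[var]!r}; local requests may go through this proxy.")

# Excluding loopback addresses from the proxy usually avoids the 502
# without disabling the proxy globally (httpx, used by the openai client,
# honors NO_PROXY when it reads proxy settings from the environment).
os.environ.setdefault("NO_PROXY", "localhost,127.0.0.1")
os.environ.setdefault("no_proxy", "localhost,127.0.0.1")

# Assumed endpoint: LM Studio's local server defaults to http://localhost:1234/v1.
# LM Studio does not check the API key, but the client requires a non-empty string.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

resp = client.chat.completions.create(
    model="TheBloke/Mistral-7B-OpenOrca-GGUF/mistral-7b-openorca.Q5_K_M.gguf",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```

If this test succeeds while `ailice_web` still fails, the proxy settings seen by the AIlice process (or the base URL/port in config.json) are the next thing to check.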