serp-ai/ChatLLaMA-and-ChatGPT-Desktop-App

No requirements.txt for pip modules

bradgillap opened this issue · 2 comments

Hello,

I tried to install this but I'm getting hung up on missing Python modules. A requirements.txt file would speed this process up.

I'm having trouble resolving this dependency despite having installed the Qt5 bindings via pip:

  File "D:\github\ChatLLaMA-and-ChatGPT-Desktop-App\launch_gui.py", line 29, in <module>
    from PyQt5.QtWebEngineWidgets import QWebEngineView
ModuleNotFoundError: No module named 'PyQt5.QtWebEngineWidgets'

I was able to get past this by running:
pip install PyQtWebEngine
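For anyone else hitting this, a minimal requirements.txt fragment covering the Qt side would be something like the following (a sketch only; the maintainers may want to pin exact versions):

```
PyQt5
PyQtWebEngine  # provides PyQt5.QtWebEngineWidgets
```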

Currently stumped by this error:

Traceback (most recent call last):
  File "/media/user/src-repos/serp-ai/ChatLLaMA-and-ChatGPT-Desktop-App/launch_gui.py", line 32, in <module>
    from .assistant import OpenAIAssistant, LocalAssistant
ImportError: attempted relative import with no known parent package

By modifying the import statements to remove the leading periods, I've been able to get a step further and am now stuck on this error:
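For context, this error happens because launch_gui.py is executed directly, so Python treats it as a top-level script with no parent package, and any `from .x import y` then has nothing to resolve against. A minimal reproduction (the commented line shows the absolute-import form that works instead):

```python
# When a file is run directly (python launch_gui.py), it has no parent
# package, so relative imports fail before any lookup even happens:
try:
    from . import assistant  # mirrors `from .assistant import ...` in launch_gui.py
except ImportError as exc:
    print(exc)  # attempted relative import with no known parent package

# Dropping the leading dot makes it an absolute import, which works as long
# as assistant.py sits next to launch_gui.py on sys.path:
#     from assistant import OpenAIAssistant, LocalAssistant
```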

  File "/media/user/src-repos/serp-ai/ChatLLaMA-and-ChatGPT-Desktop-App/launch_gui.py", line 1525, in <module>
    floating_icon = FloatingIcon(chat_config=config['chat_config'], text2audio_api_key=config['text2audio_api_key'], text2audio_voice=config['text2audio_voice'], wolfram_app_id=config['wolfram_app_id'], mode=config['mode'])
  File "/media/user/src-repos/serp-ai/ChatLLaMA-and-ChatGPT-Desktop-App/launch_gui.py", line 65, in __init__
    self.chat_dialog = ChatDialog(text2audio=TTSElevenlabs(api_key=text2audio_api_key) if text2audio_api_key != None else None, text2audio_voice='Jarvis' if text2audio_voice is None else text2audio_voice, assistant=LocalAssistant(memory_manager=MemoryManager(), **chat_config), wolfram_app_id=wolfram_app_id, mode='local')
  File "/media/user/src-repos/serp-ai/ChatLLaMA-and-ChatGPT-Desktop-App/assistant.py", line 479, in __init__
    from quantization.utils.llama_wrapper import LlamaClass
  File "/media/user/src-repos/serp-ai/ChatLLaMA-and-ChatGPT-Desktop-App/quantization/utils/llama_wrapper.py", line 5, in <module>
    from transformers import LlamaForCausalLM
ImportError: cannot import name 'LlamaForCausalLM' from 'transformers' (/home/user/.local/lib/python3.10/site-packages/transformers/__init__.py)

Having a hard time finding out which package or version LlamaForCausalLM can be pulled from.
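If I understand correctly, the LLaMA model classes (including LlamaForCausalLM) first shipped in transformers 4.28.0, so any older install raises exactly this ImportError. A quick way to check whether a given version string is new enough (the 4.28 threshold is the only assumption here):

```python
def supports_llama(version: str) -> bool:
    """Return True if this transformers release should include LlamaForCausalLM.

    The LLaMA model classes landed in transformers 4.28.0; anything older
    raises the ImportError shown above.
    """
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (4, 28)

print(supports_llama("4.27.4"))  # → False: upgrade needed
print(supports_llama("4.28.0"))  # → True
```

If this returns False for your installed `transformers.__version__`, `pip install --upgrade transformers` should resolve the import.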