meta-llama/llama3

No module named 'termios'


Describe the bug

The llama-toolchain Python package crashes on Windows because its CLI imports the Unix-only pty/termios standard-library modules at startup.

Minimal reproducible example

Install with pip install llama-toolchain.
Running llama (with or without any subcommand) crashes with No module named 'termios'.

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\Demon\AppData\Roaming\Python\Python312\Scripts\llama.exe\__main__.py", line 4, in <module>
  File "C:\Users\Demon\AppData\Roaming\Python\Python312\site-packages\llama_toolchain\cli\llama.py", line 11, in <module>
    from .stack import StackParser
  File "C:\Users\Demon\AppData\Roaming\Python\Python312\site-packages\llama_toolchain\cli\stack\__init__.py", line 7, in <module>
    from .stack import StackParser  # noqa
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Demon\AppData\Roaming\Python\Python312\site-packages\llama_toolchain\cli\stack\stack.py", line 12, in <module>
    from .configure import StackConfigure
  File "C:\Users\Demon\AppData\Roaming\Python\Python312\site-packages\llama_toolchain\cli\stack\configure.py", line 19, in <module>
    from llama_toolchain.common.exec import run_with_pty
  File "C:\Users\Demon\AppData\Roaming\Python\Python312\site-packages\llama_toolchain\common\exec.py", line 9, in <module>
    import pty
  File "C:\Program Files\Python312\Lib\pty.py", line 12, in <module>
    import tty
  File "C:\Program Files\Python312\Lib\tty.py", line 5, in <module>
    from termios import *
ModuleNotFoundError: No module named 'termios'
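
For context, termios is a Unix-only standard-library module, and pty imports tty, which in turn imports termios, so the import chain in the traceback cannot succeed on Windows regardless of how llama-toolchain is installed. A minimal check that reproduces just the failing import (nothing llama-specific):

import sys

try:
    import pty  # pty imports tty, which runs "from termios import *"
except ModuleNotFoundError as exc:
    # On Windows this prints something like: No module named 'termios'
    print(f"{sys.platform}: {exc}")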

Runtime Environment

  • Model: N/A
  • Using via huggingface?: no
  • OS: Windows
  • GPU VRAM: 8 GB
  • Number of GPUs: 1
  • GPU Make: Nvidia

I have the same issue, please help.

I have the same issue.


I found a way to fix this problem.
You need to install one more package:
pip install llama-stack

pip install llama-stack
Thanks for sharing! I tested it, and it works perfectly. Appreciate the help!
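
For anyone who still needs llama-toolchain itself to start on Windows: the crash comes from the unconditional import pty at the top of llama_toolchain/common/exec.py (see the traceback above). A generic pattern for guarding Unix-only imports is sketched below; this is not the actual upstream fix, and run_command is a hypothetical helper shown only to illustrate the pattern:

import os
import subprocess
import sys

# pty (and the termios module it pulls in via tty) exists only on Unix.
if sys.platform != "win32":
    import pty


def run_command(argv):
    """Run argv; use a pseudo-terminal on Unix, a plain subprocess on Windows."""
    if sys.platform == "win32":
        return subprocess.run(argv).returncode
    # pty.spawn() returns the raw os.waitpid() status word.
    return os.waitstatus_to_exitcode(pty.spawn(argv))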