ParisNeo/lollms-webui

Docker will not start

airdogvan opened this issue · 10 comments

Log:

docker compose up
[+] Running 1/0
⠿ Container lollms-webui-webui-1 Created 0.0s
Attaching to lollms-webui-webui-1
lollms-webui-webui-1 | Welcome! It seems this is your first use of the new lollms app.
lollms-webui-webui-1 | To make it clear where your data are stored, we now give the user the choice where to put its data.
lollms-webui-webui-1 | This allows you to mutualize models which are heavy, between multiple lollms compatible apps.
lollms-webui-webui-1 | You can change this at any tome using the lollms-settings script or by simply change the content of the global_paths_cfg.yaml file.
lollms-webui-webui-1 | Please provide a folder to store your configurations files, your models and your personal data (database, custom personalities etc).
lollms-webui-webui-1 | Folder path: (/root/Documents/lollms):Traceback (most recent call last):
lollms-webui-webui-1 |   File "/srv/app.py", line 1381, in <module>
lollms-webui-webui-1 |     lollms_paths = LollmsPaths.find_paths(force_local=True, custom_default_cfg_path="configs/config.yaml")
lollms-webui-webui-1 |   File "/usr/local/lib/python3.10/site-packages/lollms/paths.py", line 200, in find_paths
lollms-webui-webui-1 |     cfg.lollms_personal_path = input(f"Folder path: ({cfg.lollms_personal_path}):")
lollms-webui-webui-1 | EOFError: EOF when reading a line
lollms-webui-webui-1 exited with code 1

On Ubuntu 22 LTS
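
For reference, the traceback boils down to input() being called with no stdin attached: docker compose up does not keep stdin open for the container, so the first prompt immediately hits end-of-file. A minimal sketch of the same failure mode, independent of lollms:

import sys

# input() raises EOFError as soon as stdin reaches end-of-file, which is
# exactly what a process started by a non-interactive "docker compose up" sees.
try:
    folder = input("Folder path: ")
except EOFError:
    # No terminal attached, nothing to read: fall back or abort cleanly.
    sys.exit("stdin is closed; cannot prompt for the data folder")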

same problem

Meaning Docker is not supported?

Hi. I think the problem is the interactive parts of the code. I'll try to figure out a way around it. For now, the new install scripts seem to work fine.
https://youtu.be/OKoXTHXgY28

Nice, but the problem with this approach is that it "contaminates" the host you install it on. By that I mean the script installs a lot of things that not only might not be needed for anything else but also might get in the way (such as the latest version of Python, for example).
That's why I would definitely prefer to use Docker, which keeps everything nicely compartmentalized. My computer runs a lot of other software.
I might try this in a VM, but then there is no way to use CUDA...
Thanks for the feedback anyway, much appreciated.

The v3 uses a conda-based install that keeps all libraries in one folder and doesn't contaminate your local Python or conda install.

Thank you for the feedback, much appreciated!

No luck so far. I seem to have ownership problems, probably because I installed as root (seeing in the video that the script installs a lot of things), which caused problems when I then tried to run as a normal user. From what you're writing, I assume root is not needed?

Anyway, let me try a fresh install (in a VM again) and I'll let you know.

OK, fresh install in an Ubuntu 22 VM. The log is too long to paste, so I'm including it as an attachment.
xxx.txt

You never want to install anything as root. Besides, this tool was created to be safely installable even if you have no sudo or root privileges whatsoever. The new installation script is self-contained: if you are on Linux, use linux_install.sh and it should install everything you need.
As for Docker, I need to remove the interactivity for it to work.
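
One possible shape for that, just as a sketch (the LOLLMS_PERSONAL_PATH environment variable here is a hypothetical name, not an existing option, and the real find_paths logic may differ):

import os
import sys

def ask_personal_path(default_path: str) -> str:
    # Sketch of a non-interactive fallback for the folder prompt:
    # 1) honor an explicit override from the environment (hypothetical variable),
    # 2) only prompt when a real terminal is attached,
    # 3) otherwise accept the default so docker compose up does not crash.
    env_path = os.environ.get("LOLLMS_PERSONAL_PATH")
    if env_path:
        return env_path
    if sys.stdin.isatty():
        answer = input(f"Folder path: ({default_path}):").strip()
        return answer or default_path
    return default_path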

zba commented

I found a simple workaround: just run it using
docker compose run
instead of "up". But Docker is not usable out of the box, for sure. For example, the configurator asks to keep data in /root/Documents/lollms, but compose is not configured to mount this directory, so any data placed there will be lost. It also finally fails at this point:

Traceback (most recent call last):
  File "/srv/app.py", line 1774, in <module>
    shutil.copy(default_user_avatar, user_avatar_path)
  File "/usr/local/lib/python3.10/shutil.py", line 417, in copy
    copyfile(src, dst, follow_symlinks=follow_symlinks)
  File "/usr/local/lib/python3.10/shutil.py", line 254, in copyfile
    with open(src, 'rb') as fsrc:
FileNotFoundError: [Errno 2] No such file or directory: '/srv/assets/default_user.svg'

I think Docker is currently unsupported de facto.

@ParisNeo it would be better if the Docker build did not rely on the working directory :)
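
A sketch of what that could look like (this is not the actual app.py code): resolve the assets folder relative to the application file instead of the process working directory, and skip the copy when the asset is genuinely missing from the image:

from pathlib import Path
import shutil

# Sketch only: locate assets next to the application file rather than
# relying on the current working directory, which differs inside the image.
APP_DIR = Path(__file__).resolve().parent
default_user_avatar = APP_DIR / "assets" / "default_user.svg"

def copy_default_avatar(user_avatar_path: Path) -> None:
    # Skip quietly if the asset was not shipped in the image at all,
    # instead of crashing the whole startup with FileNotFoundError.
    if default_user_avatar.exists():
        shutil.copy(default_user_avatar, user_avatar_path)
    else:
        print(f"Warning: missing asset {default_user_avatar}, skipping avatar copy")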

aospan commented

Duplicate of #300