dvcrn/chatgpt-ui

docker run error

ccomkhj opened this issue · 3 comments

Thank you for the nice work!
I ran into a problem when running it via Docker.

I followed the setup procedure:

# Clone the project
git clone https://github.com/dvcrn/chatgpt-ui
cd chatgpt-ui

# Install Elixir dependencies
mix deps.get

# Install node dependencies
yarn install      # or npm install
mix assets.setup
mix assets.build

# Start the dev server
export OPENAI_API_KEY=<api-key>
export OPENAI_ORGANIZATION_KEY=<org-key>
mix phx.server

and it runs successfully.

But when I follow the Docker procedure,

docker build . -t chatgpt
docker run -e SECRET_KEY_BASE=<keybase here> -e HOST="localhost" -p 4000:4000 chatgpt
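(For reference, `SECRET_KEY_BASE` just needs to be a long random string; inside a Phoenix project you would normally generate one with `mix phx.gen.secret`. A quick standalone sketch, assuming `openssl` is available on the host:)

```shell
# Generate a 64-character secret suitable for SECRET_KEY_BASE
# (48 random bytes, base64-encoded); mix phx.gen.secret produces
# an equivalent value inside a Phoenix project.
SECRET_KEY_BASE=$(openssl rand -base64 48)
echo "$SECRET_KEY_BASE"
```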

it gets stuck at:

05:30:11.447 [info] Running ChatgptWeb.Endpoint with cowboy 2.9.0 at :::4000 (http)
05:30:11.447 [info] Access ChatgptWeb.Endpoint at https://localhost
05:30:11.447 [info] downloading tokenizer model: bert-base-multilingual-uncased

Is there a step I'm missing?

If I build the Docker image in "dev" mode, I get the error below when I run
docker run -e SECRET_KEY_BASE=<keybase here> -e HOST="localhost" -p 4000:4000 chatgpt

=CRASH REPORT==== 27-Apr-2023::05:39:59.435250 ===
  crasher:
    initial call: application_master:init/4
    pid: <0.2061.0>
    registered_name: []
    exception exit: {{shutdown,
                      {failed_to_start_child,kernel_safe_sup,
                       {on_load_function_failed,'Elixir.Tokenizers.Native',
                        {error,
                         {load_failed,
                          "Failed to load NIF library: 'Error loading shared library ld-linux-x86-64.so.2: No such file or directory (needed by /home/elixir/app/lib/tokenizers-0.3.0/priv/native/libex_tokenizers-v0.3.0-nif-2.16-x86_64-unknown-linux-gnu.so)'"}}}}},
                     {kernel,start,[normal,[]]}}
      in function  application_master:init/4 (application_master.erl, line 142)
    ancestors: [<0.2060.0>]
    message_queue_len: 1
    messages: [{'EXIT',<0.2062.0>,normal}]
    links: [<0.2060.0>,<0.2059.0>]
    dictionary: []
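(Editor's note: the missing file in the error, `ld-linux-x86-64.so.2`, is the glibc dynamic loader, while the precompiled tokenizers NIF targets `x86_64-unknown-linux-gnu`. This usually means the runtime image is musl-based, e.g. Alpine. A possible fix sketch, assuming the final Dockerfile stage uses Alpine, which is not confirmed in this thread:)

```dockerfile
# Option 1: add Alpine's glibc compatibility shim, which provides
# /lib/ld-linux-x86-64.so.2 for glibc-targeted NIFs.
RUN apk add --no-cache gcompat

# Option 2: switch the runtime stage to a glibc-based image instead, e.g.
# FROM debian:bullseye-slim
```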
dvcrn commented

Looks like a problem with the tokenizer:

05:30:11.447 [info] downloading tokenizer model: bert-base-multilingual-uncased

This is where the tokenizer downloads the tokenization model from Hugging Face. It's strange that this works outside the container but not inside it.

Are you running the containers on x86-64? What's the host architecture?
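(A quick way to check both sides, with the image name `chatgpt` taken from the build command above:)

```shell
# Print the host architecture to compare against the image's.
host_arch=$(uname -m)
echo "host: $host_arch"    # e.g. x86_64 or aarch64
# Image side (requires Docker, shown commented here):
#   docker image inspect chatgpt --format '{{.Architecture}}'
```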

ccomkhj commented

I build and run the container on x86-64, @dvcrn