opea-project/GenAIExamples

[Bug] Conversational UI instructions are incomplete and do not work

Closed this issue · 4 comments

Priority

P2-High

OS type

Ubuntu

Hardware type

Xeon-GNR

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source

Deploy method

  • Docker compose
  • Docker
  • Kubernetes
  • Helm

Running nodes

Single Node

What's the version?

latest

Description

I deployed the ChatQnA example on IBM Cloud and am trying to configure the Conversational UI. The Compose file does not start and gives the following error:

ubuntu@opea-demo:~/GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon$ docker compose up -d
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "BACKEND_SERVICE_ENDPOINT" variable is not set. Defaulting to a blank string. 
WARN[0000] The "DATAPREP_SERVICE_ENDPOINT" variable is not set. Defaulting to a blank string. 
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string. 
WARN[0000] The "LOGFLAG" variable is not set. Defaulting to a blank string. 
service "chatqna-xeon-nginx-server" depends on undefined service "chatqna-xeon-ui-server": invalid compose project

It seems more steps are required to configure this.
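
For reference, a minimal sketch of the pre-compose setup the warnings point at; the variable names beyond those in the log (for example host_ip and HUGGINGFACEHUB_API_TOKEN) are assumptions taken from the ChatQnA README rather than confirmed requirements:

# Best-guess setup flow; adjust to your environment
cd ~/GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon
export host_ip=$(hostname -I | awk '{print $1}')   # node IP reachable by the containers (assumed name)
export no_proxy="" http_proxy="" https_proxy=""    # or your real proxy settings, if any
export HUGGINGFACEHUB_API_TOKEN="your-hf-token"    # placeholder; assumed to be needed for model downloads
source ./set_env.sh
docker compose up -d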

Reproduce steps

Follow the deployment steps here: https://github.com/opea-project/GenAIExamples/tree/main/ChatQnA

Raw log

No response

Hi Arun, the doc has sections covering environment variable setup. I'm not sure whether that step was missed or the settings are not taking effect.

The error message complains about:

WARN[0000] The "BACKEND_SERVICE_ENDPOINT" variable is not set. Defaulting to a blank string. 
WARN[0000] The "DATAPREP_SERVICE_ENDPOINT" variable is not set. Defaulting to a blank string. 

set_env.sh at https://github.com/opea-project/GenAIExamples/blob/main/ChatQnA/docker_compose/intel/cpu/xeon/set_env.sh does not define those environment variables. It only contains:

export EMBEDDING_MODEL_ID="BAAI/bge-base-en-v1.5"
export RERANK_MODEL_ID="BAAI/bge-reranker-base"
export LLM_MODEL_ID="Intel/neural-chat-7b-v3-3"
export INDEX_NAME="rag-redis"
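
As an interim workaround, those two endpoints could be exported manually before running compose. The host, ports, and paths below are only a sketch based on the usual ChatQnA defaults, so please verify them against your compose file:

export host_ip="your-node-ip"   # placeholder for the node's reachable IP
export BACKEND_SERVICE_ENDPOINT="http://${host_ip}:8888/v1/chatqna"     # assumed default port/path
export DATAPREP_SERVICE_ENDPOINT="http://${host_ip}:6007/v1/dataprep"   # assumed default port/path
export LOGFLAG=""               # optional; only silences the LOGFLAG warning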

Hi Arun, an environment variable update PR was merged a few days ago; with that change, the variable is no longer needed in the yaml file. Would you try again with the latest commit?
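
Something along these lines should pick up the change, assuming the checkout location from the reproduce steps:

cd ~/GenAIExamples && git pull origin main
cd ChatQnA/docker_compose/intel/cpu/xeon
source ./set_env.sh
docker compose up -d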

Closing the issue since there has been no update for a while. Arun, please re-open it if the issue still reproduces with the latest commit, thanks.