infiniflow/ragflow

[Bug]: Cannot register API models (OpenAI, Azure OpenAI) from within the company network

Closed this issue · 2 comments

Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

a88a184

RAGFlow image version

1b817a5(v0.14.1~62) slim

Other environment information

No response

Actual behavior

I am trying to set up a RAGFlow container within our company network according to the README.
I changed the exposed nginx ports and added proxy and no_proxy settings to the .env file so the container can reach the outside network, since nginx was already running on the same server.

.env

~~~~~~~~~~~~
EXPOSE_NGINX_PORT=50080
EXPOSE_NGINX_SSL_PORT=50443

http_proxy=http://cache.xxx.xx.xx:1234
https_proxy=http://cache.xxx.xx.xx:1234
HTTP_PROXY=http://cache.xxx.xx.xx:1234
HTTPS_PROXY=http://cache.xxx.xx.xx:1234
no_proxy=localhost,127.0.0.1,...,ragflow,redis,es01,minio,mysql,infinity,kibana,kibana-user-init
NO_PROXY=localhost,127.0.0.1,...,ragflow,redis,es01,minio,mysql,infinity,kibana,kibana-user-init
~~~~~~~~~~~~

docker-compose.yaml

~~~~~~~~~~~~~
services:
  ragflow:
    depends_on:
      mysql:
        condition: service_healthy
    image: ${RAGFLOW_IMAGE}
    container_name: ragflow-server
    ports:
      - ${SVR_HTTP_PORT}:9380
      - ${EXPOSE_NGINX_PORT}:80
      - ${EXPOSE_NGINX_SSL_PORT}:443
~~~~~~~~~~~~~
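Note that variables defined in .env are used by Docker Compose for interpolation, but they are not automatically exported into the container's environment. If that is the issue here, one possible fix (a sketch, not verified against this deployment) is to pass the proxy settings through explicitly in the service definition:

~~~~~~~~~~~~~
services:
  ragflow:
    environment:
      # Forward the proxy settings from .env into the container's
      # environment so the server process and its HTTP clients see them
      - http_proxy=${http_proxy}
      - https_proxy=${https_proxy}
      - HTTP_PROXY=${HTTP_PROXY}
      - HTTPS_PROXY=${HTTPS_PROXY}
      - no_proxy=${no_proxy}
      - NO_PROXY=${NO_PROXY}
~~~~~~~~~~~~~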

The container started up without any problems, and registering the self-hosted Xinference model worked, but API models such as OpenAI and Azure OpenAI could not be registered.
Is there some other setting I need to configure?
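One thing worth checking (a sketch; `ragflow-server` is the container name from the compose file above) is whether the proxy variables actually made it into the container's environment:

~~~~~~~~~~~~~
# If this prints nothing, the server process has no proxy configured,
# even though .env defines one for Compose interpolation
docker exec ragflow-server env | grep -i proxy
~~~~~~~~~~~~~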

Expected behavior

Ability to register models from within the company network

Steps to reproduce

After modifying .env and docker-compose.yaml as above,
start the container according to the README.

Additional information

Container logs in question

2024-12-05 13:30:25,545 INFO     18 172.16.41.6 - - [05/Dec/2024 13:30:25] "GET /v1/user/info HTTP/1.1" 200 -
2024-12-05 13:30:25,550 INFO     18 172.16.41.6 - - [05/Dec/2024 13:30:25] "GET /v1/llm/my_llms HTTP/1.1" 200 -
2024-12-05 13:30:25,556 INFO     18 172.16.41.6 - - [05/Dec/2024 13:30:25] "GET /v1/user/tenant_info HTTP/1.1" 200 -
2024-12-05 13:30:25,588 INFO     18 172.16.41.6 - - [05/Dec/2024 13:30:25] "GET /v1/llm/factories HTTP/1.1" 200 -
2024-12-05 13:30:31,939 INFO     18 Retrying request to /chat/completions in 0.871602 seconds
2024-12-05 13:30:32,818 INFO     18 Retrying request to /chat/completions in 1.506051 seconds
2024-12-05 13:30:34,334 INFO     18 172.16.41.6 - - [05/Dec/2024 13:30:34] "POST /v1/llm/add_llm HTTP/1.1" 200 -
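The retries against /chat/completions suggest that add_llm validates the key with a chat-completion call, and that the OpenAI client inside the container cannot reach api.openai.com. A quick connectivity check (a sketch; the proxy URL is the placeholder from .env above):

~~~~~~~~~~~~~
# Try the OpenAI endpoint through the corporate proxy from inside
# the container; a 401 means the proxy path works (only the API key
# is missing), while a timeout means OpenAI is still unreachable
docker exec ragflow-server bash -c \
  'curl -x http://cache.xxx.xx.xx:1234 -sS -o /dev/null -w "%{http_code}\n" https://api.openai.com/v1/models'
~~~~~~~~~~~~~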

I'm not quite sure how to set a proxy in a Docker container, but this should not be missed.

(screenshot attached)

Accessing the OpenAI API from banned regions via the HTTPS_PROXY environment variable is disallowed by OpenAI.
The RAGFlow team will not try to support such a use case.