Canner/WrenAI

SSL Certificate Verification Error during Initialization

Closed this issue · 2 comments

Bug Description
When starting the Wren AI service, it fails to verify the SSL certificate for the OpenAI API, resulting in an APIConnectionError. Temporarily disabling SSL verification with curl's -k option allows the request to succeed, confirming that the issue is related to SSL certificate verification.

Steps to Reproduce

  1. Start the Wren AI service.
  2. Observe the error logs indicating an SSL certificate verification failure.
  3. Run the following command inside the container terminal to test the connection:
    curl -v -k https://api.openai.com/v1/models -H "Authorization: Bearer YOUR_API_KEY"
    Notice that the command runs successfully with SSL verification disabled.
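Since the error is "unable to get local issuer certificate", a useful first check is where the container's Python interpreter looks for trusted CA certificates. This is a diagnostic sketch, not part of Wren AI itself; if a corporate proxy re-signs TLS traffic, its root CA must appear in one of these locations (or be pointed to via the SSL_CERT_FILE environment variable) for verification to succeed:

```python
# Inspect the default CA trust locations used by this Python build.
# If the corporate/proxy root CA is not in the bundle shown here,
# httpx (used by the openai client) will fail certificate verification.
import ssl

paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)                        # CA bundle file, if any
print("capath:", paths.capath)                        # directory of individual certs
print("cafile override env var:", paths.openssl_cafile_env)  # usually SSL_CERT_FILE
print("capath override env var:", paths.openssl_capath_env)  # usually SSL_CERT_DIR
```

Running this inside the wren-ai-service container shows which bundle needs to contain the missing issuer certificate.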

Error Logs

2024-07-18 10:32:46 INFO: Started server process [7]
2024-07-18 10:32:46 INFO: Waiting for application startup.
2024-07-18 10:32:46 2024-07-18 05:02:46,830 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,879 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,879 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,881 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,883 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,884 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,884 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,892 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,895 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,896 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:64)
2024-07-18 10:32:48 2024-07-18 05:02:48,897 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-07-18 10:32:52 ERROR: Traceback (most recent call last):
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-07-18 10:32:52 yield
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-07-18 10:32:52 resp = self._pool.handle_request(req)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-07-18 10:32:52 raise exc from None
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-07-18 10:32:52 response = connection.handle_request(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-07-18 10:32:52 raise exc
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-07-18 10:32:52 stream = self._connect(request)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-07-18 10:32:52 stream = stream.start_tls(**kwargs)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-07-18 10:32:52 with map_exceptions(exc_map):
2024-07-18 10:32:52 File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-07-18 10:32:52 self.gen.throw(value)
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-07-18 10:32:52 raise to_exc(exc) from exc
2024-07-18 10:32:52 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)
2024-07-18 10:32:52
2024-07-18 10:32:52 The above exception was the direct cause of the following exception:
2024-07-18 10:32:52
2024-07-18 10:32:52 Traceback (most recent call last):
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 978, in _request
2024-07-18 10:32:52 response = self._client.send(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-07-18 10:32:52 response = self._send_handling_auth(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-07-18 10:32:52 response = self._send_handling_redirects(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-07-18 10:32:52 response = self._send_single_request(request)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-07-18 10:32:52 response = transport.handle_request(request)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-07-18 10:32:52 with map_httpcore_exceptions():
2024-07-18 10:32:52 File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-07-18 10:32:52 self.gen.throw(value)
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-07-18 10:32:52 raise mapped_exc(message) from exc
2024-07-18 10:32:52 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)
2024-07-18 10:32:52
2024-07-18 10:32:52 The above exception was the direct cause of the following exception:
2024-07-18 10:32:52
2024-07-18 10:32:52 Traceback (most recent call last):
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-07-18 10:32:52 async with self.lifespan_context(app) as maybe_state:
2024-07-18 10:32:52 File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-07-18 10:32:52 return await anext(self.gen)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/src/main.py", line 28, in lifespan
2024-07-18 10:32:52 container.init_globals()
2024-07-18 10:32:52 File "/src/globals.py", line 36, in init_globals
2024-07-18 10:32:52 llm_provider, embedder_provider, document_store_provider, engine = init_providers()
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/src/utils.py", line 67, in init_providers
2024-07-18 10:32:52 llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/src/providers/llm/openai.py", line 138, in __init__
2024-07-18 10:32:52 _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-07-18 10:32:52 File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-07-18 10:32:52 OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-07-18 10:32:52 return self._get_api_list(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1315, in get_api_list
2024-07-18 10:32:52 return self._request_api_list(model, page, opts)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1160, in _request_api_list
2024-07-18 10:32:52 return self.request(page, options, stream=False)
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 942, in request
2024-07-18 10:32:52 return self._request(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1002, in _request
2024-07-18 10:32:52 return self._retry_request(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1079, in _retry_request
2024-07-18 10:32:52 return self._request(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1002, in _request
2024-07-18 10:32:52 return self._retry_request(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1079, in _retry_request
2024-07-18 10:32:52 return self._request(
2024-07-18 10:32:52 ^^^^^^^^^^^^^^
2024-07-18 10:32:52 File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1012, in _request
2024-07-18 10:32:52 raise APIConnectionError(request=request) from err
2024-07-18 10:32:52 openai.APIConnectionError: Connection error.
2024-07-18 10:32:52
2024-07-18 10:32:52 ERROR: Application startup failed. Exiting.
2024-07-18 10:32:18 Waiting for wren-ai-service to start...
2024-07-18 10:32:44 Waiting for wren-ai-service to start...

Expected behavior
Wren AI should set up successfully, the wren-ai-service application should start, and we should be able to ask questions against the connected database and get the expected output.

Desktop (please complete the following information):

  • OS: Windows 10 Enterprise
  • Browser: Chrome

Wren AI Information

  • LLM_PROVIDER= # openai
  • GENERATION_MODEL= # gpt-3.5-turbo, gpt-4

This issue is related to the user's host environment. We'll help figure out why this issue happened in the first place.
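Since the host environment (likely a corporate proxy that re-signs TLS traffic) is the suspected cause, a safer workaround than disabling verification is to add the proxy's root CA to the container's trust store. This is a hypothetical sketch: the container name, certificate filename, and a Debian/Ubuntu-based image are all assumptions, not confirmed details of the Wren AI setup:

```shell
# Hypothetical workaround sketch -- names are examples, adjust to your setup.
# 1. Export the proxy's root CA from Windows (certmgr.msc -> export as
#    Base-64 encoded .CER), then copy it into the running container:
docker cp corp-root-ca.crt wrenai-wren-ai-service-1:/usr/local/share/ca-certificates/corp-root-ca.crt

# 2. Regenerate the system CA bundle inside the container
#    (works on Debian/Ubuntu-based images):
docker exec -u root wrenai-wren-ai-service-1 update-ca-certificates

# 3. If the Python client still fails, point it at the updated bundle
#    via the standard OpenSSL environment variable and restart:
#    SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt
```

This keeps certificate verification enabled rather than bypassing it, which avoids exposing the OpenAI API key over an unverified connection.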

cyyeh commented

sorry, I couldn't reproduce the issue as of now