aurelio-labs/semantic-router

02-Dynamic-Router.ipynb Error after `rl.add(time_route)`

Closed this issue · 3 comments

Hi, I tried the notebook docs/02-dynamic-routes.ipynb on Google Colab, using a Cohere API key for the embedding model.
The static route works fine, but the dynamic route throws an error in the last cell (see screenshot).

Error:

2024-03-10 07:38:16 WARNING semantic_router.utils.logger No LLM provided for dynamic route, will use OpenAI LLM default. Ensure API key is set in OPENAI_API_KEY environment variable.
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
[<ipython-input-11-f02444f1667a>](https://localhost:8080/#) in <cell line: 1>()
----> 1 out = rl("what is the time in new york city?")
      2 get_time(**out.function_call)

1 frames
[/usr/local/lib/python3.10/dist-packages/semantic_router/llms/openai.py](https://localhost:8080/#) in __init__(self, name, openai_api_key, temperature, max_tokens)
     27         api_key = openai_api_key or os.getenv("OPENAI_API_KEY")
     28         if api_key is None:
---> 29             raise ValueError("OpenAI API key cannot be 'None'.")
     30         try:
     31             self.client = openai.OpenAI(api_key=api_key)

ValueError: OpenAI API key cannot be 'None'.

For some reason it does not pick up that I am already using Cohere, as in the static route.


Hey, can you try setting the OPENAI_API_KEY env var? I.e. `os.environ["OPENAI_API_KEY"] = "your api key"`. That should help :)
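For example (the key value below is a placeholder):

```python
import os

# The dynamic route's default OpenAI LLM reads OPENAI_API_KEY when the
# route layer is invoked, so set it before calling rl(...).
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, use your real key
```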

Hi @jamescalam thanks for the reply!

The thing is, is this behaviour intended? What confused me is that the first cells already succeed with the static route using the Cohere API, so how is the dynamic route different? I saw your tutorial using a local HuggingFace encoder and this does not happen there.

I'm still new to semantic-router, and it feels like the way forward for routing, especially with OSS models that are not reliable. My questions are:

  1. Is there a different LLM for embedding the route layer versus the one that generates the output?
  2. I will try with OpenAI later and get back to you, but the question remains: why does the dynamic router behave differently from the static router?
  3. Rather off-topic, but can you help with sample code using Ollama? I saw it's already merged but still can't figure out how to use it. Does Ollama use the HuggingFace encoder, and how do I set it up if I want to use it with nomic-embed-text?

Hi @antonsapt4, yes, this is the expected behaviour.

  1. Yes, an LLM (OpenAI in this example) is required for dynamic routes, whereas static routes only use embedding models (Cohere in this example).
  2. As above, a dynamic route requires an LLM, whereas a static route does not.
  3. Yes there is an example here.

I hope those help!