BerriAI/litellm

[Bug]:

Closed this issue · 1 comment

What happened?

When using `lm = dspy.OpenAI(api_base="http://0.x.x.53:8000/v1/", api_key='EMPTY', model="/root/models/Meta-Llama-3.1-70B-Instruct", max_tokens=200)` I can easily get responses from the LM running at the specified URL.

However, when using `lm = dspy.LM(model='openai/root/models/Meta-Llama-3.1-70B-Instruct', api_key="EMPTY", api_base="http://0.x.x.53:8000/v1/")` I get the `KeyError` shown below.

Relevant log output

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Traceback (most recent call last):
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/litellm/main.py", line 2896, in completion
    string_response = response_json["data"][0]["output"][0]
                      ~~~~~~~~~~~~~^^^^^^^^
KeyError: 'data'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/nilsb/testDir/dspy-Communication.py", line 15, in <module>
    result = qa(question="Was ist der Unterschied zwischen High Memory und Low Memory in Linux?")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/utils/callback.py", line 202, in wrapper
    return fn(instance, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/predict/predict.py", line 121, in __call__
    return self.forward(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/predict/predict.py", line 155, in forward
    completions = v2_5_generate(lm, config, signature, demos, kwargs, _parse_values=self._parse_values)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/predict/predict.py", line 264, in v2_5_generate
    return adapter(
           ^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/adapters/base.py", line 19, in __call__
    outputs = lm(**inputs_, **lm_kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/utils/callback.py", line 202, in wrapper
    return fn(instance, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/clients/lm.py", line 71, in __call__
    response = completion(ujson.dumps(dict(model=self.model, messages=messages, **kwargs)))
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/clients/lm.py", line 159, in cached_litellm_completion
    return litellm_completion(request, cache={"no-cache": False, "no-store": False})
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/dspy/clients/lm.py", line 164, in litellm_completion
    return litellm.completion(cache=cache, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/litellm/utils.py", line 1070, in wrapper
    raise e
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/litellm/utils.py", line 958, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/litellm/main.py", line 2957, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2131, in exception_type
    raise e
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2107, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: 'data'
Traceback (most recent call last):
  File "/home/nilsb/testDir/venv/lib/python3.11/site-packages/litellm/main.py", line 2896, in completion
    string_response = response_json["data"][0]["output"][0]
                      ~~~~~~~~~~~~~^^^^^^^^
KeyError: 'data'


Hey. It needs a second '/' after the `openai/` prefix, so the model name should be `openai//root/models/Meta-Llama-3.1-70B-Instruct`.
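A minimal sketch of why the extra slash matters, assuming (as the fix above suggests) that the text before the first `/` in the model string is treated as the provider and everything after it as the model name sent to the server. When the served model name is itself a filesystem path starting with `/`, dropping the second slash strips that leading `/` from what the server receives:

```python
def split_provider(model: str) -> tuple[str, str]:
    # Split off the provider prefix at the FIRST "/" only.
    provider, _, name = model.partition("/")
    return provider, name

# Without the second slash, the path's leading "/" is swallowed:
print(split_provider("openai/root/models/Meta-Llama-3.1-70B-Instruct"))
# → ('openai', 'root/models/Meta-Llama-3.1-70B-Instruct')

# With it, the server receives the exact path it serves the model under:
print(split_provider("openai//root/models/Meta-Llama-3.1-70B-Instruct"))
# → ('openai', '/root/models/Meta-Llama-3.1-70B-Instruct')
```

So the corrected call from the report would look like `dspy.LM(model='openai//root/models/Meta-Llama-3.1-70B-Instruct', api_key="EMPTY", api_base="http://0.x.x.53:8000/v1/")` (same arguments as in the original, only the model string changed).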