BerriAI/litellm

[Bug]: Does x-ai not support the response_format parameter?


What happened?


    import litellm
    from litellm import completion

    litellm.set_verbose = True
    litellm.enable_json_schema_validation = True
    ret = completion(**kwargs)  # kwargs holds the arguments passed to completion (model, messages, response_format, ...)

SDK version:

litellm 1.51.3

Relevant log output

litellm.exceptions.BadRequestError: litellm.BadRequestError: XaiException - Error code: 400 - {'code': 'Client specified an invalid argument', 'error': 'The response_format parameter is unsupported at the moment. Leave it unset.'}
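As a side note, litellm exposes a helper for checking which OpenAI-style parameters it maps for a given provider, which can confirm whether response_format is expected to work before sending the request. A minimal sketch, assuming get_supported_openai_params is importable from the top-level litellm package (as in recent releases):

from litellm import get_supported_openai_params

# Lists the OpenAI-compatible params litellm will forward for the xai provider.
# If "response_format" is not in the list, the provider does not support it.
supported = get_supported_openai_params(model="grok-beta", custom_llm_provider="xai")
print(supported)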


from litellm import completion
import os

if __name__ == "__main__":
    os.environ["XAI_API_KEY"] = ""
    response = completion(
        model="xai/grok-beta",
        messages=[
            {
                "role": "user",
                "content": "讲个笑话",
            }
        ],
        max_tokens=10,
        response_format={"type": "json_object"},
        seed=123,
        stop=["\n\n"],
        temperature=0.2,
        top_p=0.9,
        tool_choice="auto",
        tools=[],
        user="user",
    )
    print(response)

litellm 1.52.0

This example has the same problem.

Try it this way:

response = completion(
    model="xai/grok-beta",
    messages=[
        {
            "role": "user",
            "content": "Why is the sky blue?",
        }
    ],
    max_tokens=10,
    response_format={ "type": "json_object" },
    seed=123,
    stop=["\n\n"],
    temperature=0.2,
    top_p=0.9,
    user="user",
    drop_params=True
)
print(response)
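For context, drop_params=True asks litellm to silently drop any parameters the target provider does not support (here, response_format for xai) instead of forwarding them and getting the 400 back. If you want that behaviour for every call rather than per request, there is also a module-level switch; a minimal sketch, assuming the global litellm.drop_params flag:

import litellm
from litellm import completion

# Drop provider-unsupported params (e.g. response_format on xai) for all completion calls.
litellm.drop_params = True

response = completion(
    model="xai/grok-beta",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    response_format={"type": "json_object"},  # dropped automatically if the provider rejects it
)
print(response)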