[FEATURE REQ] make `include_usage` optional in chat/completions API
Opened this issue · 2 comments
Describe the feature or improvement you are requesting
Hello,
Thank you again for maintaining the C# client.
Many other services now implement the OpenAI chat/completions API, and Mistral, for example, does not support the `include_usage` flag: when it is set, the server returns a 422.
The Python client has a workaround that allows overriding the streaming options; see openai/openai-agents-python#442.
In the C# client, this is impossible for two reasons:
1. `InternalChatCompletionStreamOptions` and `ChatCompletionOptions.StreamOptions` would have to be made public; and
2. The `ChatClient` would have to respect the override. Currently the code always sets `StreamOptions` to a private static instance:
```csharp
private void CreateChatCompletionOptions(IEnumerable<ChatMessage> messages, ref ChatCompletionOptions options, bool stream = false)
{
    options.Messages = messages.ToList();
    options.Model = _model;

    if (stream)
    {
        options.Stream = true;
        options.StreamOptions = s_includeUsageStreamOptions;
    }
    else
    {
        options.Stream = null;
        options.StreamOptions = null;
    }
}
```
with

```csharp
private static readonly InternalChatCompletionStreamOptions s_includeUsageStreamOptions = new(includeUsage: true, additionalBinaryDataProperties: null);
```

(note `includeUsage` being set to `true` here).
With `ChatCompletionOptions.StreamOptions` made public, we could write:

```csharp
ChatCompletionOptions options = new();
options.StreamOptions = new InternalChatCompletionStreamOptions();
options.StreamOptions.IncludeUsage = null;
```

and change `CreateChatCompletionOptions` to honor `StreamOptions` when it is already set:
```csharp
private void CreateChatCompletionOptions(IEnumerable<ChatMessage> messages, ref ChatCompletionOptions options, bool stream = false)
{
    options.Messages = messages.ToList();
    options.Model = _model;

    if (stream)
    {
        options.Stream = true;
        if (options.StreamOptions == null)
        {
            options.StreamOptions = s_includeUsageStreamOptions;
        }
    }
    else
    {
        options.Stream = null;
        options.StreamOptions = null;
    }
}
```
With that change, the C# client becomes compatible with Mistral AI.
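To illustrate, a caller targeting a Mistral-compatible endpoint could then opt out of usage reporting along these lines. This is only a sketch assuming the proposed changes land; the endpoint URL and model name are placeholders:

```csharp
// Sketch only: assumes ChatCompletionOptions.StreamOptions and its type are made public.
ChatClient client = new(
    model: "mistral-large-latest", // placeholder model name
    credential: new ApiKeyCredential(Environment.GetEnvironmentVariable("MISTRAL_API_KEY")!),
    options: new OpenAIClientOptions
    {
        Endpoint = new Uri("https://api.mistral.ai/v1") // placeholder endpoint
    });

ChatCompletionOptions options = new()
{
    // Leaving IncludeUsage unset keeps stream_options out of the request,
    // so a server that rejects the flag no longer returns 422.
    StreamOptions = new InternalChatCompletionStreamOptions()
};

await foreach (StreamingChatCompletionUpdate update in
    client.CompleteChatStreamingAsync(messages, options))
{
    // consume updates as usual
}
```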
I understand this is not the main focus, but seeing that the fix was accepted in the Python client, I was hoping it could also be made in C#.
I personally don't see why `InternalChatCompletionStreamOptions` should be internal; the type is well documented.
I am happy to submit a pull request.
Additional context
No response
Hi @gaspardpetit. Thanks for reaching out and for your suggestion. I'm inclined to agree that this should be exposed as an option, as I cannot find any rationale for forcing the value. Looking over the Platform docs and REST API spec, both define this as an optional boolean. There's nothing that implies that it must be set for streaming.
I'm not sure that I agree with the suggested implementation, but we can dig into design once this gets picked up.
Unfortunately, there's no easy way to override this value in the interim. Because the options are forced so late in the cycle, the best that I was able to come up with was to write a custom policy to replace the entire request body.
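In case it helps anyone in the interim, such a policy could look roughly like this (a hedged sketch built on System.ClientModel's `PipelinePolicy`; the JSON rewriting is illustrative rather than production-ready):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using System.ClientModel;
using System.ClientModel.Primitives;
using System.Text.Json.Nodes;

// Sketch: strips "stream_options" from the outgoing chat/completions request body
// so servers that reject include_usage no longer return 422.
public class RemoveStreamOptionsPolicy : PipelinePolicy
{
    public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        RewriteBody(message);
        ProcessNext(message, pipeline, currentIndex);
    }

    public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
    {
        RewriteBody(message);
        await ProcessNextAsync(message, pipeline, currentIndex).ConfigureAwait(false);
    }

    private static void RewriteBody(PipelineMessage message)
    {
        if (message.Request?.Content is null)
        {
            return;
        }

        // Buffer the original body, drop the offending property, and replace the content.
        using MemoryStream buffer = new();
        message.Request.Content.WriteTo(buffer, default);
        buffer.Position = 0;

        if (JsonNode.Parse(buffer) is JsonObject body && body.Remove("stream_options"))
        {
            message.Request.Content = BinaryContent.Create(BinaryData.FromString(body.ToJsonString()));
        }
    }
}
```

If I understand the pipeline correctly, the policy would be registered via `OpenAIClientOptions` (which derives from `ClientPipelineOptions`), e.g. `clientOptions.AddPolicy(new RemoveStreamOptionsPolicy(), PipelinePosition.PerCall);`.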
@christothes, this is another scenario that could/should be addressed with protocol models. Moreover, because it's a streaming scenario, it might be a very good validation case.