toshiakit/MatGPT

Can the new version of MatGPT receive responses from the LLM as a stream? If not, how can we implement that?

huliangbing opened this issue · 5 comments


This is a duplicate of #23, "How to streamingly receive replies from a large model using Matlab?"
Please see my answer there.

Thank you very much!
I will give it a try using matlab.net.http.io.StringConsumer.
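For the manual route via matlab.net.http, one possible approach is to subclass StringConsumer and override putData to print each chunk as it arrives. A minimal sketch follows; the class name StreamPrinter and the endpoint URL are illustrative, not part of MatGPT or any library, and the single-byte text conversion is a simplification.

% StreamPrinter.m -- a sketch of a custom consumer that prints streamed chunks.
classdef StreamPrinter < matlab.net.http.io.StringConsumer
    methods
        function [len, stop] = putData(obj, data)
            % Let StringConsumer convert and buffer the raw bytes first.
            [len, stop] = putData@matlab.net.http.io.StringConsumer(obj, data);
            if ~isempty(data)
                % Print the newly received chunk (assumes ASCII/UTF-8
                % single-byte text; a simplification for this sketch).
                fprintf("%s", char(data'));
            end
        end
    end
end

% Example use against a hypothetical streaming endpoint:
req = matlab.net.http.RequestMessage;
opts = matlab.net.http.HTTPOptions;
resp = send(req, "https://example.com/stream", opts, StreamPrinter);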

This feature has been added in LLMs with MATLAB.

% Define the function handle. This function prints the streamed text as it arrives.
sf = @(x) fprintf("%s", x);
% Create the chat object with the function handle.
chat = openAIChat(StreamFun=sf);
% Generate a response to a prompt in streaming mode.
prompt = "What is Model-Based Design?";
[text, message, response] = generate(chat, prompt);
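Since MATLAB anonymous functions cannot modify variables in the enclosing workspace, a nested function (or a handle object) is one way to keep state in the stream callback, for example to accumulate chunks while printing them. A sketch under that assumption, using the same openAIChat/generate API as above; streamAndCollect is an illustrative name:

function answer = streamAndCollect(prompt)
% Sketch: print each streamed chunk and also accumulate the full reply.
answer = "";
chat = openAIChat(StreamFun=@collect);
generate(chat, prompt);
    function collect(x)
        fprintf("%s", x);     % live printout of each chunk
        answer = answer + x;  % nested function can write to the outer workspace
    end
end

Note that generate already returns the full text in its first output, so per-chunk state like this mainly matters when you want to process chunks incrementally, e.g., to update a UI element as the reply streams in.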

Thanks a lot!

This is supported in MatGPT 2.0.1.