Can the new version of MatGPT receive streaming responses from the LLM? If not, how can we implement it?
huliangbing opened this issue · 5 comments
huliangbing commented
Can the new version of MatGPT receive streaming responses from the LLM?
If not, how can we implement it?
toshiakit commented
This is a duplicate of #23, "How to streamingly receive replies from a large model using Matlab?"
Please see my answer there.
huliangbing commented
Thank you very much!
I will try using matlab.net.http.io.StringConsumer.
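For reference, a minimal sketch of that approach: subclass matlab.net.http.io.StringConsumer and override putData so each decoded chunk is echoed as it arrives. The class name and the incremental-print bookkeeping below are illustrative, and a real OpenAI client would additionally need to parse the server-sent "data:" lines:

classdef StreamPrinter < matlab.net.http.io.StringConsumer
    properties (Access = private)
        Printed = 0  % number of characters already echoed to the console
    end
    methods
        function [len, stop] = putData(obj, data)
            % Let the superclass decode the bytes and append them to Response.Body.Data
            [len, stop] = putData@matlab.net.http.io.StringConsumer(obj, data);
            full = string(obj.Response.Body.Data);
            if strlength(full) > obj.Printed
                % Echo only the part that has not been shown yet
                fprintf("%s", extractAfter(full, obj.Printed));
                obj.Printed = strlength(full);
            end
        end
    end
end

% Usage: pass an instance as the consumer argument of send()
% resp = send(request, uri, matlab.net.http.HTTPOptions, StreamPrinter);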
toshiakit commented
This feature has been added in LLMs with MATLAB.
% Define the function handle. This function prints the returned text.
sf = @(x)fprintf("%s",x);
% Create the chat object with the function handle.
chat = openAIChat(StreamFun=sf);
% Generate a response to a prompt in streaming mode.
prompt = "What is Model-Based Design?";
[txt, message, response] = generate(chat, prompt);
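The StreamFun callback receives each chunk of text as it is produced, while generate still returns the full text, message, and response at the end. If you also want to accumulate the chunks yourself, one option (a sketch; printAndStore is an illustrative helper, not part of the package) is to route the callback through a mutable holder such as a containers.Map:

% Accumulate the streamed chunks in addition to printing them.
% containers.Map is a handle class, so the anonymous function can
% mutate shared state through it.
holder = containers.Map('KeyType','char','ValueType','any');
holder('text') = "";
sf = @(chunk) printAndStore(chunk, holder);
chat = openAIChat(StreamFun=sf);
[txt, message, response] = generate(chat, "What is Model-Based Design?");
% After generation, holder('text') should hold the same text assembled from chunks.

function printAndStore(chunk, holder)
    % Echo the chunk and append it to the shared buffer
    fprintf("%s", chunk);
    holder('text') = holder('text') + string(chunk);
end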
huliangbing commented
Thanks a lot!
toshiakit commented
This is supported in MatGPT 2.0.1.