Feature Request: TextStreaming
hushaudio opened this issue · 2 comments
hushaudio commented
Is it possible to add a text streaming feature? It looks like you're loading a local cpp server — does Swift support sockets for React Native? Inference is so slow on mobile devices right now that streaming would help the user know something is happening. Interested in contributing if you need contribs. I believe streaming is supported by llama.cpp in LangChain's implementation, but I'm not sure if that's custom.
Vali-98 commented
Hey there, I maintain an app that uses llama.rn, so I have some pointers on this. Text can be streamed using the callback parameter of LlamaContext.completion:
async completion(
params: CompletionParams,
callback?: (data: TokenData) => void,
): Promise<NativeCompletionResult>
Simply pass in a callback function that appends each new token to some state as it arrives.
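To make the pattern concrete, here is a minimal sketch of token-by-token accumulation. The real context comes from llama.rn's initLlama; since that needs a native runtime, a stub completion stands in below with the same callback shape, and the token values are invented for illustration.

```typescript
type TokenData = { token: string };

// Stub mimicking LlamaContext.completion's callback contract (assumption:
// real generation is driven by llama.rn's native module, not this loop).
async function completion(
  params: { prompt: string },
  callback?: (data: TokenData) => void,
): Promise<{ text: string }> {
  const tokens = ["Hello", ", ", "world", "!"]; // stand-in for generated tokens
  let text = "";
  for (const t of tokens) {
    callback?.({ token: t }); // fired as each token is produced
    text += t;
  }
  return { text };
}

// Streaming usage: append each token to UI state as it arrives.
let streamed = "";

async function main(): Promise<string> {
  const result = await completion({ prompt: "Say hi" }, (data) => {
    // In a React component this would be e.g.
    // setOutput(prev => prev + data.token);
    streamed += data.token;
  });
  return result.text;
}
```

In a real component you'd replace the `streamed +=` line with a functional state update so the UI re-renders on each token.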
hushaudio commented
oh amazing, thank you so much!