Request guidance/feature to stream streaming tool responses back
weihaoxia-01 commented
Is your feature request related to a problem? Please describe.
Is it possible to stream a tool's streaming response back to the client incrementally?
If the tool itself calls an upstream streaming endpoint and receives chunks like this:

```ts
// inside a tool handler: the upstream call returns a streaming body
const res = await client.fetch(...);
for await (const chunk of res.body) {
  // each chunk arrives incrementally
}
```

is it possible to relay those chunks back to the MCP client immediately, instead of buffering the whole response into a single tool result?
Describe the solution you'd like
Please provide guidance on how to do this kind of streaming, or implement a new feature that supports it.
Describe alternatives you've considered
https://modelcontextprotocol.io/specification/2025-03-26/basic/utilities/progress provides a progress-notification mechanism, but not all clients support it.
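For reference, the progress-notification workaround can be sketched generically. This is a minimal sketch, not MCP SDK code: `sendProgress` is a hypothetical stand-in for whatever channel the server uses to emit `notifications/progress` messages, and `fakeBody` simulates an upstream streaming response. The idea is that each upstream chunk is forwarded as a notification while the full text is still accumulated as the final tool result, so non-streaming clients lose nothing:

```typescript
// Hypothetical stand-in for emitting a notifications/progress message;
// in a real MCP server this would go over the active session.
type ProgressSender = (piece: string, bytesSoFar: number) => void;

// Consume a streaming body chunk by chunk, forwarding each decoded piece
// as a progress update, and return the accumulated text as the final
// tool result (what clients without progress support will see).
async function relayStream(
  body: AsyncIterable<Uint8Array>,
  sendProgress: ProgressSender,
): Promise<string> {
  const decoder = new TextDecoder();
  let text = "";
  let bytes = 0;
  for await (const chunk of body) {
    bytes += chunk.byteLength;
    const piece = decoder.decode(chunk, { stream: true });
    text += piece;
    sendProgress(piece, bytes); // incremental update for clients that support it
  }
  text += decoder.decode(); // flush any buffered multi-byte sequence
  return text;
}

// Demo: simulate an upstream streaming body with an async generator.
async function* fakeBody(): AsyncIterable<Uint8Array> {
  const enc = new TextEncoder();
  yield enc.encode("Hello, ");
  yield enc.encode("world!");
}

async function main() {
  const updates: string[] = [];
  const result = await relayStream(fakeBody(), (piece) => updates.push(piece));
  console.log(updates.length); // 2
  console.log(result); // Hello, world!
}
main();
```

The drawback the issue points out still holds: clients that ignore progress notifications only ever see the final accumulated result, which is why first-class streaming of tool output would be preferable.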
Additional context