slackapi/bolt-js

Slack - Support of LLM streaming

Closed this issue · 6 comments

Hey everyone,

Currently, LLM APIs (like the OpenAI API) stream the LLM response token by token, since waiting for the entire response usually takes ~7-10 seconds.

Is there any intention of supporting streaming in the Slack platform? Namely, I'd like to build a chatbot in Slack and stream its answers to users. Currently there's no easy way of doing that, so I simply post the answer once the LLM has finished.

I was thinking of doing multiple "edits" to the original message, but I'm afraid it'd get rate-limited due to the large number of API calls.

If you can shed some light on this issue, that would be great :)

Hey @david1542 👋 👾 This is a really interesting request and I'll share it with the team, but I don't believe there are any immediate plans for streaming HTTP requests with the Web API.

FWIW I've also experimented with this and found multiple edits to be alright. I get rate limited often though and will probably change this to send only one update per second to chat.update 🤪
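The "one update per second" approach above can be sketched as a small helper that buffers streamed tokens and only pushes an edit when enough time has passed. This is a minimal sketch, not an official pattern: `postUpdate` stands in for a call to Slack's `chat.update` Web API method, and the one-second interval is an assumption based on the comment above, not a documented rate limit.

```javascript
// Minimal sketch of throttled message edits while streaming LLM tokens.
// `postUpdate` is a stand-in for a chat.update call, e.g.
//   (text) => client.chat.update({ channel, ts, text })
// in a Bolt app. The 1000 ms interval is an assumption, not an official limit.
function createStreamingUpdater(postUpdate, intervalMs = 1000) {
  let buffer = "";
  let lastSent = 0;

  return {
    // Append a streamed token; only push an edit if the interval has elapsed.
    async append(token) {
      buffer += token;
      const now = Date.now();
      if (now - lastSent >= intervalMs) {
        lastSent = now;
        await postUpdate(buffer);
      }
    },
    // Always push the final text once the stream ends, so no tokens are lost.
    async finish() {
      await postUpdate(buffer);
      return buffer;
    },
  };
}
```

In a Bolt listener you would first post a placeholder with `chat.postMessage`, keep its `ts`, then feed each chunk from the LLM stream into `append` and call `finish` when the stream closes.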

Thanks for the quick reply! @zimeg The edits solution seems reasonable for now, I think I'll give it a shot :)
I'd love to see native streaming in Slack though! I think it'd improve the UX of all the LLM bots these days

Hey @david1542, my example also does similar for reflecting chunked responses from OpenAI's API: https://github.com/seratch/ChatGPT-in-Slack Hope this helps.

👋 It looks like this issue has been open for 30 days with no activity. We'll mark this as stale for now, and wait 10 days for an update or for further comment before closing this issue out. If you think this issue needs to be prioritized, please comment to get the thread going again! Maintainers also review issues marked as stale on a regular basis and comment or adjust status if the issue needs to be reprioritized.

As this issue has been inactive for more than one month, we will be closing it. Thank you to all the participants! If you would like to raise a related issue, please create a new issue which includes your specific details and references this issue number.

This feature gives a great user experience for LLM bots in Slack. Any future plans on this?