aws-samples/aws-serverless-openai-chatbot-demo

Support for streaming outputs from OpenAI APIs


In the Lambda function, I can see that the 'stream' parameter is not passed to the OpenAI API call. Supporting it would need significant refactoring, since the current code waits for the full response to arrive before returning anything. Has anyone implemented this with streaming?
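For anyone exploring this, here is a minimal sketch of one possible approach, assuming a Node.js Lambda behind a Function URL with response streaming enabled (InvokeMode=RESPONSE_STREAM) and the openai v4 SDK. This is not the demo's actual code; the handler name, request body shape, and model are illustrative, and the repo's real architecture (e.g. API Gateway WebSockets) may call for a different design.

```typescript
// Hypothetical sketch: stream OpenAI chat completion chunks straight to the
// client instead of buffering the whole response in the Lambda.
import OpenAI from "openai";

// `awslambda` is a global provided by the Node.js Lambda runtime when
// response streaming is enabled; declared here for TypeScript.
declare const awslambda: {
  streamifyResponse: (
    handler: (event: any, responseStream: NodeJS.WritableStream) => Promise<void>
  ) => any;
};

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export const handler = awslambda.streamifyResponse(
  async (event: any, responseStream: NodeJS.WritableStream) => {
    const body = JSON.parse(event.body ?? "{}");

    // stream: true makes the SDK return an async iterable of chunks
    // rather than a single completed response.
    const completion = await client.chat.completions.create({
      model: body.model ?? "gpt-3.5-turbo",
      messages: body.messages ?? [],
      stream: true,
    });

    // Forward each token delta as soon as it arrives.
    for await (const chunk of completion) {
      const delta = chunk.choices[0]?.delta?.content;
      if (delta) {
        responseStream.write(delta);
      }
    }
    responseStream.end();
  }
);
```

The trade-off is that the frontend must also be changed to read an incremental stream (e.g. via fetch and a ReadableStream) instead of a single JSON payload, which is where most of the refactoring effort would go.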