AppcentMobile/ACMOpenAI-iOS

ACMOpenAI-Chat Streamed chat doesn't work - Response Parse Error

Closed this issue · 2 comments

Hi, I am encountering an issue when I try to use streamed chat with ACMOpenAI. This is the simple view model I used to call OpenAI's chat API.

```swift
import Foundation
import ACMOpenAI

class ContentViewViewModel: ObservableObject {
    var openAI = ACMOAIChatManager()
    @Published var responseText: String = ""

    var messages: [[String: String]] = [
        ["role": "system", "content": "hello"]
    ]

    func addMessage(message: String) {
        messages.append(["role": "user", "content": message])
        openAI.create(request: .init(model: "gpt-3.5-turbo", messages: messages, stream: false)) { success in
            self.responseText = success?.choices?.first?.message?.content ?? ""
        } onError: { error in
            // TODO: handle error
        }
    }
}
```
With `stream: false` like this, the response arrives and it works. But when I add the `stream: true` parameter to the `openAI.create` call, it throws an error with this message:

❌ -> The given data was not valid JSON.

So I checked the response, and it is different from the `stream: false` one. Here is the response of the streamed chat:

[Screenshot: raw response body returned with `stream: true`]

Also, the response should not arrive all at once; it should come in as a stream when I set `stream: true`.
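For context (this is not ACMOpenAI's code): with `stream: true`, OpenAI returns server-sent events, i.e. a sequence of `data: {JSON chunk}` lines terminated by `data: [DONE]`, so decoding the whole body as a single JSON document fails with exactly this kind of error. Below is a minimal sketch of how a single chunk line could be parsed in Swift; `ChatStreamChunk` and `parseStreamLine` are hypothetical names for illustration, and the field names follow OpenAI's documented chat-completion stream format, not ACMOpenAI's internal models.

```swift
import Foundation

// Minimal shape of one streamed chunk, e.g.
// data: {"choices":[{"delta":{"content":"Hi"}}]}
struct ChatStreamChunk: Decodable {
    struct Choice: Decodable {
        struct Delta: Decodable { let content: String? }
        let delta: Delta
    }
    let choices: [Choice]
}

/// Extracts the text delta from one SSE line.
/// Returns nil for the terminating `data: [DONE]` line or anything unparsable.
func parseStreamLine(_ line: String) -> String? {
    guard line.hasPrefix("data: ") else { return nil }
    let payload = String(line.dropFirst("data: ".count))
    guard payload != "[DONE]", let data = payload.data(using: .utf8) else { return nil }
    let chunk = try? JSONDecoder().decode(ChatStreamChunk.self, from: data)
    return chunk?.choices.first?.delta.content
}
```

A client that sets `stream: true` therefore has to split the body on newlines, decode each `data:` payload separately, and accumulate the `delta.content` pieces as they arrive, instead of decoding the response as one completion object.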

This task is now a work in progress.

Stream response support added.