bidah/react-native-vercel-ai

Example doesn't work

Closed this issue · 3 comments

The examples posted in the documentation don't work.
The structure of the objects being sent to the server doesn't match the example.
Based on my testing, the structure of the objects sent back from the server doesn't match what the client expects either.

bidah commented

The example app does not include the API, only the RN app. So, as per the docs, for the non-streaming response you should grab the message and send it back on the data key, e.g. return NextResponse.json({ data: response.choices[0].message });
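To make the contract concrete, here is a minimal sketch of the request/response shapes described above. The message and field names are illustrative assumptions, except for the `messages` request key and the `data` response key, which come from the thread:

```typescript
// Hypothetical illustration of the non-streaming contract described above:
// the client POSTs { messages: [...] } and the server replies with the
// assistant message on the `data` key.

type ChatMessage = { role: 'user' | 'assistant' | 'system'; content: string };

// Shape of the request body the client sends.
const requestBody: { messages: ChatMessage[] } = {
  messages: [{ role: 'user', content: 'Hello!' }],
};

// Shape of the JSON the route returns, i.e. the result of
// NextResponse.json({ data: response.choices[0].message }).
const responseBody: { data: ChatMessage } = {
  data: { role: 'assistant', content: 'Hi there!' },
};

// The client reads the assistant reply from the `data` key.
function extractReply(body: { data: ChatMessage }): string {
  return body.data.content;
}

console.log(extractReply(responseBody)); // prints "Hi there!"
```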

That being said, I do think the example app needs the API counterpart so that it's easier to run, or at least the app's readme should document the default way to interact with it. I'll try to make this easier to run when picking the example app back up, @rkeshwani. Thanks for finding this.

The streaming API was not working for me while testing with Expo Go, so I commented that part out and adjusted the server-side code as shown below, and it started working well. Note that this gives OpenAI the full context of the chat conversation rather than just the previous message:

try {
    const { messages: prompt } = await req.json();
    console.log('prompt', prompt);
    const userAgentData = userAgent(req);
    const isNativeMobile = userAgentData.ua?.includes('Expo');

    // if (!isNativeMobile) {
    //   // Ask OpenAI for a streaming chat completion given the prompt
    //   const response = await openai.chat.completions.create({
    //     model: 'gpt-3.5-turbo',
    //     stream: true,
    //     messages: prompt,
    //   });
    //   // Convert the response into a friendly text-stream
    //   const stream = OpenAIStream(response);
    //   // Respond with the stream
    //   return new StreamingTextResponse(stream);
    // } else {
    // Ask OpenAI for a non-streaming chat completion given the prompt
    const response = await openai.chat.completions.create({
      model: 'gpt-3.5-turbo',
      // Set your provider stream option to `false` for native
      stream: false,
      messages: prompt,
    });
    console.log(response);
    return NextResponse.json({ data: response.choices[0].message });
    // }
} catch (error) {
    console.error(error);
    return NextResponse.json({ error: 'Chat completion failed' }, { status: 500 });
}
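The handler above branches on the user agent to decide between streaming (web) and non-streaming (native) responses, using Next.js's `userAgent(req)` helper. A small sketch of just that check, with the helper replaced by a plain string for illustration:

```typescript
// Sketch of the user-agent check used above to detect a native client.
// Expo clients include "Expo" in their user-agent string; a web browser
// does not, so it would get the streaming branch instead.
function isNativeMobile(ua: string | undefined): boolean {
  return ua?.includes('Expo') ?? false;
}

console.log(isNativeMobile('Expo/2.31.2 CFNetwork/1410 Darwin/22.6.0')); // true
console.log(isNativeMobile('Mozilla/5.0 (Macintosh)')); // false
```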
bidah commented

Added a new example app with expo-router and a Next.js app for the /chat API route. Double-tested, and it should be working now too.