SchnapsterDog/nuxt-chatgpt

chatCompletion array of messages

mauroIstat opened this issue · 3 comments

Hi,
your project is awesome and it works great with almost no effort :)

Nevertheless, I would suggest changing the interface of the chatCompletion function from:

const chatCompletion = async (message: IMessage, model?: IModel, options?: IOptions)

to

const chatCompletion = async (messages: IMessages, model?: IModel, options?: IOptions)

or something similar :)
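
For illustration only, the array-based shape could look roughly like this (the type definitions below are placeholders I am assuming, not the module's actual ones):

// Hypothetical sketch of the proposed array-based interface -- type names are assumptions.
interface IMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

type IMessages = IMessage[];
type IModel = string; // e.g. "gpt-3.5-turbo"

interface IOptions {
  temperature?: number;
  max_tokens?: number;
}

// The caller passes the whole conversation history instead of a single message.
declare const chatCompletion: (messages: IMessages, model?: IModel, options?: IOptions) => Promise<string>;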

According to the OpenAI API specification, a chat completion can be configured with an array of messages:

import OpenAI from "openai";

const openai = new OpenAI();

const response = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Who won the world series in 2020?" },
    { "role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020." },
    { "role": "user", "content": "Where was it played?" }
  ],
});

This change would make it easy to implement use cases such as the "Socratic Tutor" https://platform.openai.com/examples/default-socratic-tutor.
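
For example, with an array-based interface the Socratic Tutor pattern is just a matter of maintaining the conversation history and passing it in one call (using the hypothetical types sketched above):

// Hypothetical usage of the array-based interface for a Socratic Tutor conversation.
const conversation: IMessages = [
  { role: "system", content: "You are a Socratic tutor. Never give the answer directly; guide the student with questions." },
  { role: "user", content: "How do I find the slope of a line?" },
];

const reply = await chatCompletion(conversation, "gpt-3.5-turbo", { temperature: 0.7 });

// Keep the assistant's reply in the history so follow-up turns retain the full context.
conversation.push({ role: "assistant", content: reply });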

All the best,
Mauro

Hi @mauroIstat

Thanks for the feedback. I really appreciate it.

It is actually a nice idea. I will need some time to test it, but I will put it at the top of my to-do list.

All the best,

Oliver

@mauroIstat Firstly, I want to say sorry for the late response. The chatCompletion function can now be used as a "Socratic Tutor" and is configured to send an array of messages. I updated the Readme.md file, where you can easily try and test the new functionality that the module offers. Thanks :)
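
A rough sketch of how it can be used in a Nuxt component (please treat the Readme as the source of truth for the exact, up-to-date composable and parameter names):

<script setup lang="ts">
// Sketch only -- see the Readme for the exact, up-to-date API of the module.
const { chatCompletion } = useChatgpt()

const messages = [
  { role: "system", content: "You are a Socratic tutor. Guide the student with questions instead of giving answers." },
  { role: "user", content: "Why does dividing by zero have no result?" },
]

async function askTutor() {
  const answer = await chatCompletion(messages, "gpt-3.5-turbo")
  console.log(answer)
}
</script>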

Oliver