babaohuang/GeminiProChat

How to configure the chosen model?

Opened this issue · 1 comment

How is Gemini Pro Chat deployed?

Node

Describe the bug

The current Gemini version is Gemini 1.5 Pro. How do I configure the model parameter?

Console Logs

No response

Participation

  • I am willing to submit a pull request for this issue.

Modify line 11 of the src/utils/openAI.ts file:

const model = genAI.getGenerativeModel({ model: 'gemini-pro' })
  • Parameter: model
  • Parameter value: gemini-1.5-pro

For example:

const model = genAI.getGenerativeModel({ model: 'gemini-1.5-pro' })
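
For context, here is a minimal sketch of what the surrounding code could look like if the model name were read from the environment instead of being hard-coded. This is not the repository's exact code: it assumes the official @google/generative-ai SDK, and the GEMINI_MODEL variable name is a hypothetical example, not something the project necessarily defines.

import { GoogleGenerativeAI } from '@google/generative-ai'

// Hypothetical: read the model name from an environment variable,
// falling back to 'gemini-1.5-pro' when it is not set.
const modelName = process.env.GEMINI_MODEL ?? 'gemini-1.5-pro'

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? '')
const model = genAI.getGenerativeModel({ model: modelName })

// Example helper: send a single prompt and return the text reply.
export const generateReply = async (prompt: string) => {
  const result = await model.generateContent(prompt)
  return result.response.text()
}

With this approach, switching models is a deployment-time setting rather than a source change.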