I need a little help as I may have an issue with my setup.
hjyuh commented
Describe the bug
Once I run Ollama, the site comes up. When I try to prompt one of the models, it just keeps loading, or it states 'There was an error processing your request: Invalid or missing API key.' This is what I get when I try to prompt Ollama.
There is simply no option to select.
Link to the Bolt URL that caused the error
I ran it on localhost.
Steps to reproduce
I simply downloaded it by following this doc: https://docs.google.com/document/d/19UNRP1c6ulDS_X7Ig7mRTI_EaT0xcvgimnfOJDKm7ig/edit?tab=t.0
Expected behavior
I expected the Qwen AI prompt option to be available.
Screen Recording / Screenshot
Platform
- OS: [Windows]
- Browser: [Firefox]
- Version: [e.g. 91.1]
Provider Used
No response
Model Used
None of the models work.
Additional context
No response
Soumyaranjan-17 commented
Please share screenshots of:
- your `.env` file
- your browser dev console
- your code editor terminal
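For reference, the Ollama entry in `.env` usually looks something like the sketch below, and Ollama itself does not need an API key. Treat the variable name `OLLAMA_API_BASE_URL` and the default port 11434 as assumptions based on a typical local setup; check the project's own `.env.example` for the exact names.

```bash
# Hypothetical .env entry pointing the app at a local Ollama instance
OLLAMA_API_BASE_URL=http://127.0.0.1:11434

# Quick sanity checks that Ollama is running and has models pulled
# (run these in a terminal):
#   ollama list
#   curl http://127.0.0.1:11434/api/tags
```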
hjyuh commented
Hello,
I managed to find and resolve the error.
Thanks.
Soumyaranjan-17 commented
Can you explain your problem and the steps you took to fix it?