Future-House/paper-qa

LLM recommendation for locally hosted paper-qa

Closed this issue · 1 comment

Can someone recommend an LLM for running paper-qa against a locally hosted model? What is the minimum model size needed to get reasonably good performance with paper-qa?

Hey @huiyyu, since our newest release integrates litellm, you can now host a model locally via ollama or llama.cpp and pass the server URL into your Settings. Check out our README for more info.
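
A minimal sketch of what that setup can look like, assuming an ollama server running at the default `http://localhost:11434` with a `llama3` model already pulled; the model name, endpoint, and example question are placeholders to adapt to your own setup:

```python
from paperqa import Settings, ask

# litellm-style routing config pointing paper-qa at a local ollama server.
# "ollama/llama3" and the api_base are assumptions; swap in your own model/URL.
local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3",
            "litellm_params": {
                "model": "ollama/llama3",
                "api_base": "http://localhost:11434",  # default ollama endpoint
            },
        }
    ]
}

answer = ask(
    "What manufacturing challenges are unique to bioprinted tissues?",  # placeholder question
    settings=Settings(
        llm="ollama/llama3",
        llm_config=local_llm_config,
        summary_llm="ollama/llama3",
        summary_llm_config=local_llm_config,
    ),
)
```

The same pattern should apply if you want a fully local pipeline: the embedding model can be configured on `Settings` as well, so nothing has to leave your machine.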