Pythagora-io/pythagora

Configure with local LLM?

uninstallit opened this issue · 1 comment

Is it possible to configure this with a local LLM, e.g. Ollama or LlamaCpp servers?

Some code cannot be sent to external APIs.

Thanks

Hey @uninstallit, sorry, we are not maintaining this project anymore. Our full focus is on GPT Pilot at the moment.
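
For anyone landing here with the same question: Pythagora itself has no documented local-LLM switch, but tools built on the OpenAI Node SDK can often be redirected to any OpenAI-compatible server, and Ollama exposes one at `http://localhost:11434/v1`. Below is a minimal sketch of that generic redirection, not Pythagora's own configuration; the base URL and the `llama3` model name are assumptions about your local setup.

```ts
// Minimal sketch: pointing the OpenAI Node SDK at a local Ollama server
// instead of api.openai.com. Assumes Ollama is running locally and a model
// such as "llama3" has been pulled (`ollama pull llama3`).
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint
  apiKey: "ollama", // any non-empty string; Ollama does not check it
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "llama3", // must match a model available in your Ollama install
    messages: [
      { role: "user", content: "Write a unit test for an add() function." },
    ],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

This keeps prompts and code on your machine, which addresses the concern about sending code to external APIs, though whether it can be wired into this archived project would require patching its OpenAI client setup yourself.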