Issues
- is the pre-assembled dataset included in the repo in "alpaca" formatting?
  #17 opened by jurassicjordan (3 comments)
- Get error on v1.0.1
  #16 opened by windyjl (3 comments)
- It would be cool if one could access self-hosted generative models (Llama, WizardLLM, etc.)
  #15 opened by darkhog (5 comments)
- Thank you! + A couple ideas
  #2 opened by Kelin2025 (0 comments)
- Awesome addon :+1: Do you have plans to create a chat, and/or add configuration for the LLM endpoints, prompts and models?
  #14 opened by Efoi (2 comments)
- Link Broken for Copilot Token Extraction?
  #12 opened by MatMice (1 comment)
- Github copilot token, instead of OpenAI
  #7 opened by Khalilbz (0 comments)
- Adding Codeium
  #8 opened by padreputativo (3 comments)
- Great idea, kind of not a very good name
  #6 opened by servel333 (2 comments)
- Dangling HttpRequests
  #3 opened by 73v3