twinnydotdev/twinny
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
TypeScript · MIT
Issues
- Outputs only "undefined" (#202, 2 comments)
- Support Comments Translation (#200, 2 comments)
- Ideal setup of parallel chat and FIM models (#197, 1 comment)
- [Feature] JetBrains plugins (#195, 3 comments)
- No robot icon, no completion (#194, 2 comments)
- Cannot chat successfully with Ollama (#192, 2 comments)
- et API Bearer Token is not working (#189, 11 comments)
- Oobabooga vs. Twinny (#180, 3 comments)
- Support for web version of VS Code? (#177, 1 comment)
- Add support for Deci-based models (#176, 1 comment)
- Improve docs with step-by-step setup (#164, 6 comments)
- Problem with HTTPS/TLS configuration (#161, 5 comments)
- Can't get connected to my Ollama (#155, 3 comments)
- Support FIM for models using ChatML format (#142, 6 comments)
- Support StarCoder models for FIM (#141, 5 comments)
- Bad/no suggestions in chat and FIM (#137, 3 comments)
- Generate commit message with git diff (#134, 6 comments)
- Generation gets interrupted (#133, 1 comment)
- Please update Open VSX version (#132, 7 comments)
- Fetch failed when using a remote HTTPS URL (#131, 2 comments)
- Connecting Oobabooga API with twinny (#129, 2 comments)
- Custom system messages per task type (#123, 1 comment)
- On-demand scroll down (#122, 1 comment)
- Code completion not working (#118, 4 comments)
- twinny gets garbage data from llama.cpp (#116, 2 comments)