twinnydotdev/twinny
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
TypeScript · MIT License
Issues
[FIM+Ollama+qwen2.5+Repository level] RangeError: path should be a `path.relative()`d string
#426 opened by derjanb - 2
Where did conversation history go? Where did the message I typed go? Why no sane provider defaults? Why no OpenAI-like providers?
#355 opened by atljoseph - 2
[BUG] Dead Link in Docs
#421 opened by spenpal - 0
404 when using separate LLM server on network
#420 opened by AshkanArabim - 4
twinny doesn't work in dev container
#356 opened by jerry072036 - 1
Not working with dev containers?
#419 opened by vitobotta - 4
Embed File Type Filter
#326 opened by jdspugh - 1
Add support for FIM Codestral API
#405 opened by robertpiosik - 2
Help wanted! Add translations for your native language and become a contributor!
#395 opened by rjmacarthy - 1
Added a BOS token to the prompt as specified by the model but the prompt also starts with a BOS token.
#314 opened by machisuji - 3
Code completion broken (nonsense output)
#338 opened by vrgimael - 1
Random blabbering Nonsense
#413 opened by Dejon141 - 2
Large files cause freeze when embedding
#386 opened by rjmacarthy - 1
Include all open files in FIM requests
#406 opened by robertpiosik - 2
Utilizing Open WebUI Embeddings with Ollama i.e., `/ollama/v1/embeddings`
#324 opened by unclemusclez - 7
"Consider to use 'storageUri' or 'globalStorageUri' to store this data on disk instead."
#400 opened by unclemusclez - 4
Errors occur when using Twinny and certain AI extensions (such as FitenCode) in VS Code together
#391 opened by ZhongGuangshen - 5
Chat not working no matter the model used
#315 opened by machisuji - 9
embedding doesn't work
#387 opened by jurykor - 7
"Embed workspace documents" doesn't work properly
#384 opened by makiz1999 - 0
Display Connection Failure Prompt When Large Model Dialogue Times Out or Returns Errors (e.g., 404)
#373 opened by yuzcat01 - 2
FIM autocomplete format not being changed correctly
#377 opened by paryska99 - 0
Problem with duplicate conversation history
#379 opened by fishshi - 4
The 'Embedded workspace documents' option excludes files from processing (by their name) based on unclear rules.
#350 opened by jurykor - 3
Modify the logic for matching the FILE_IGNORE_LIST
#370 opened by fishshi - 3
Can't find module @lancedb/lancedb-win32-x64-msvc
#364 opened by fishshi - 3
Twinny Extension won't load. Lancedb module missing
#365 opened by gpinkham - 1
Can't seem to get chat completion working?
#363 opened by phyzical - 0
Back button does not hide when using the activity bar to return to the chat page
#360 opened by fishshi - 3
Trouble running with Windows Ollama
#354 opened by RandoOnSteam - 9
Twinny tries ollama URL for oobabooga embeddings
#336 opened by allo- - 2
Embedding provider selection reverts to the "default" one.
#351 opened by vkx86 - 3
Providing reference files in a prompt template?
#330 opened by gpinkham - 4
Default Embedding Option Auto-Reverts
#334 opened by unclemusclez - 3
Stuck GUI
#335 opened by unclemusclez - 13
Twinny Extension crashing on VSCodium and VSCode
#319 opened by tatojunior - 1
Open WebUI update
#329 opened by unclemusclez - 2
Stream doesn't close
#343 opened by dkpoulsen - 1
OpenAI: o1 models fail to work
#321 opened by JamesClarke7283 - 2
Better log implementation
#331 opened by MolikoDeveloper - 14
Add support for qwen2.5-coder
#317 opened by kv-gits - 2
Full response in the title
#328 opened by MolikoDeveloper - 0
Custom Templates not working with 3.17
#323 opened by unclemusclez - 0
Managing prompt context
#320 opened by kv-gits