Weird behavior with "codellama/CodeLlama-13b-hf"
icnahom opened this issue · 7 comments
icnahom commented
Suggestions are not displayed when using CodeLlama. This is not the case with StarCoder, which shows the suggestion on the line it was triggered from.
Here are my attempts:
Requesting from Line 5: nothing is displayed.
Requesting from Line 6, with the cursor one space to the right: the suggestion is shown.
Settings
{
  "llm.fillInTheMiddle.enabled": true,
  "llm.fillInTheMiddle.prefix": "<PRE> ",
  "llm.fillInTheMiddle.middle": " <MID>",
  "llm.fillInTheMiddle.suffix": " <SUF>",
  "llm.temperature": 0.2,
  "llm.contextWindow": 4096,
  "llm.tokensToClear": [
    "<EOT>"
  ],
  "llm.tokenizer": {
    "repository": "codellama/CodeLlama-13b-hf"
  },
  "llm.enableAutoSuggest": true,
  "llm.maxNewTokens": 256,
  "llm.configTemplate": "codellama/CodeLlama-13b-hf",
  "llm.modelIdOrEndpoint": "codellama/CodeLlama-13b-hf"
}
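For context, these fill-in-the-middle settings determine how the prompt is assembled before it is sent to the model. A minimal sketch of that assembly, assuming the client simply concatenates the configured tokens around the text before and after the cursor (the order shown matches CodeLlama's documented infilling format; buildFIMPrompt is an illustrative name, not the actual llm-ls function):

package main

import "fmt"

// buildFIMPrompt is an illustrative helper, not actual llm-ls code.
// It wraps the text before and after the cursor with the configured
// fill-in-the-middle tokens, in CodeLlama's infilling order:
// <PRE> {prefix} <SUF>{suffix} <MID>
func buildFIMPrompt(before, after string) string {
    return "<PRE> " + before + " <SUF>" + after + " <MID>"
}

func main() {
    before := "func main() {\n\t"      // text before the cursor
    after := "\n\tfmt.Println(msg)\n}" // text after the cursor
    fmt.Println(buildFIMPrompt(before, after))
}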
Example Code
package main

import "fmt"

func main() {
    // Cursor here
    go func() { messages <- "ping" }()
    msg := <-messages
    fmt.Println(msg)
}
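For reference, the line the model is expected to fill in at the cursor is presumably the channel declaration from the standard Go channels example; the completed program would look like this (shown only to make the expected suggestion concrete):

package main

import "fmt"

func main() {
    messages := make(chan string) // the line the suggestion should produce
    go func() { messages <- "ping" }()
    msg := <-messages
    fmt.Println(msg)
}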
VSCode
Version: 1.85.0
McPatate commented
I'm not sure this is due to the model; it's more likely the code that decides whether a suggestion should be shown.
What version of llm-vscode & llm-ls are you running?
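To illustrate the kind of gate meant here, a purely hypothetical sketch (shouldShowSuggestion is invented for illustration and is not the actual llm-vscode/llm-ls code): a completion that reduces to empty or whitespace-only text after the stop tokens are stripped would be silently dropped, which could explain why shifting the cursor by one column changes whether anything appears.

package main

import (
    "fmt"
    "strings"
)

// shouldShowSuggestion is a hypothetical gate sketched to illustrate
// the idea: after stripping the configured stop tokens, a completion
// that is empty or whitespace-only is rejected and nothing is shown.
func shouldShowSuggestion(generated string, tokensToClear []string) bool {
    for _, tok := range tokensToClear {
        generated = strings.ReplaceAll(generated, tok, "")
    }
    return strings.TrimSpace(generated) != ""
}

func main() {
    fmt.Println(shouldShowSuggestion("<EOT>", []string{"<EOT>"}))                         // false: nothing left to show
    fmt.Println(shouldShowSuggestion("messages := make(chan string)", []string{"<EOT>"})) // true
}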
McPatate commented
You don't need to, sorry; llm-ls comes bundled with the extension.
github-actions commented
This issue is stale because it has been open for 30 days with no activity.
icnahom commented
> Hey @icnahom, did you find a fix for this please? Thanks!

I haven't looked into it yet.
github-actions commented
This issue is stale because it has been open for 30 days with no activity.