huggingface/llm-vscode

Weird behavior with "codellama/CodeLlama-13b-hf"

icnahom opened this issue · 7 comments

The suggestion is not displayed when using CodeLlama. This is not the case with Starcoder, which shows the suggestion on the line it was triggered from.

Here are the attempts:

Requesting from Line 5, nothing gets displayed.


Requesting from Line 6, with one space to the right, shows the suggestion.


Settings

{
    "llm.fillInTheMiddle.enabled": true,
    "llm.fillInTheMiddle.prefix": "<PRE> ",
    "llm.fillInTheMiddle.middle": " <MID>",
    "llm.fillInTheMiddle.suffix": " <SUF>",
    "llm.temperature": 0.2,
    "llm.contextWindow": 4096,
    "llm.tokensToClear": [
        "<EOT>"
    ],
    "llm.tokenizer": {
        "repository": "codellama/CodeLlama-13b-hf"
    },
    "llm.enableAutoSuggest": true,
    "llm.maxNewTokens": 256,
    "llm.configTemplate": "codellama/CodeLlama-13b-hf",
    "llm.modelIdOrEndpoint": "codellama/CodeLlama-13b-hf"
}
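For context, the four fillInTheMiddle settings above are sentinel tokens that the client concatenates with the text around the cursor before sending the request. A minimal sketch of that assembly, assuming CodeLlama's documented infilling order of <PRE> prefix <SUF>suffix <MID> (the function name here is illustrative, not llm-vscode's actual code):

```go
package main

import "fmt"

// buildFIMPrompt assembles a fill-in-the-middle prompt from the text
// before and after the cursor, using the configured sentinel tokens.
// (Hypothetical helper for illustration only.)
func buildFIMPrompt(prefix, middle, suffix, before, after string) string {
	// CodeLlama infilling layout: <PRE> {before} <SUF>{after} <MID>
	return prefix + before + suffix + after + middle
}

func main() {
	prompt := buildFIMPrompt("<PRE> ", " <MID>", " <SUF>",
		"func main() {\n", "\n}")
	fmt.Println(prompt)
}
```

The model is then expected to generate the missing middle, terminated by <EOT>, which is why that token appears in llm.tokensToClear.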

Example Code

package main

import "fmt"

func main() {
	// Cursor here
	func() { messages <- "ping" }()
	msg = <-messages
	fmt.Println(msg)
}

VSCode

Version: 1.85.0

I'm not sure this is due to the model; it seems more likely to be the code that decides whether or not to show a suggestion.
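The kind of client-side filter being referred to might look like the sketch below. This is purely illustrative and not llm-ls's actual implementation: if a generated completion is trimmed or compared against surrounding text before display, a completion that is all whitespace (or that merely repeats the text after the cursor) would be hidden at one cursor position but shown at another.

```go
package main

import (
	"fmt"
	"strings"
)

// shouldShow is a hypothetical filter: it hides suggestions that are
// empty after trimming, or that only repeat the text already after
// the cursor. (Illustrative only, not the actual llm-ls code.)
func shouldShow(generated, textAfterCursor string) bool {
	s := strings.TrimSpace(generated)
	return s != "" && s != strings.TrimSpace(textAfterCursor)
}

func main() {
	fmt.Println(shouldShow("   ", ""))                          // blank: hidden
	fmt.Println(shouldShow("messages := make(chan string)", "")) // shown
}
```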
What version of llm-vscode & llm-ls are you running?

@McPatate llm-vscode: v0.1.6

How do I check for llm-ls?

You don't need to, sorry. llm-ls comes bundled with the extension.

This issue is stale because it has been open for 30 days with no activity.

Hey @icnahom, did you find a fix for this, please? Thanks!

I haven't looked into it yet.

This issue is stale because it has been open for 30 days with no activity.