huggingface/llm-vscode

Latest version is crashing when using autocomplete

darolt opened this issue · 3 comments

I am using this extension with a private model. It works fine in v0.1.0, but it does not work in v0.1.6.
When typing, I get a pop-up with this message:
Pending response rejected since connection got disposed
In the Output panel, the logs show:

2024-01-19 10:28:34.333 [info] thread 'main' panicked at /home/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/ropey-1.6.0/src/rope.rs:803:13:
Attempt to index past end of Rope: char index 155, Rope char length 155
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
2024-01-19 10:28:34.334 [info] [Info  - 10:28:34 AM] {llm-ls} file changed

Apparently the error occurs only when the cursor is at the end of the file. In other positions, it seems to work.
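For context, the panic in the log comes from ropey's bounds check: valid char indices run from 0 to len_chars() - 1, so asking for the char at exactly len_chars() (i.e. the cursor sitting at the very end of the file) trips it. Below is a minimal sketch of that failure mode and a non-panicking alternative, assuming the ropey 1.6 crate; it is only an illustration of the panic message, not the actual llm-ls code path.

```rust
use ropey::Rope;

fn main() {
    let rope = Rope::from_str("fn main() {}\n");
    let cursor = rope.len_chars(); // cursor at the very end of the file

    // Failure mode from the log: `char()` panics with
    // "Attempt to index past end of Rope" when the index equals len_chars().
    // let _c = rope.char(cursor);

    // Non-panicking alternative: the `get_*` accessors return an Option instead.
    match rope.get_char(cursor) {
        Some(c) => println!("char under cursor: {c:?}"),
        None => println!("cursor is at end of file, nothing to index"),
    }
}
```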

After 5 attempts, it gives the following message:
The LLM VS Code server crashed 5 times in the last 3 minutes. The server will not be restarted. See the output for more information.

For runtime status, it seems ok:
[screenshot: extension runtime status]

The VS Code details:
[screenshot: VS Code version details]

This issue is stale because it has been open for 30 days with no activity.

Does upgrading to 0.2.x fix the issue?

This issue is stale because it has been open for 30 days with no activity.