huggingface/llm-vscode

Client is not running

DjWarmonger opened this issue · 25 comments

That's what I get instead of autocompletion, whenever I type:

(screenshot of the "Client is not running" error popup)

Quite cryptic TBH.

Runtime status:

Uncaught Errors (9)
write EPIPE
Client is not running and can't be stopped. It's current state is: starting
write EPIPE
write EPIPE
Client is not running and can't be stopped. It's current state is: starting
Pending response rejected since connection got disposed
Client is not running and can't be stopped. It's current state is: starting
Client is not running and can't be stopped. It's current state is: startFailed
Client is not running and can't be stopped. It's current state is: startFailed

Using Windows 11 with my local LM Studio server.

I have the same cryptic error. Using llm-vscode version 0.1.5, VSCode version 1.83.1, on macOS Ventura 13.5.2.

The client isn't compatible with llama.cpp at the moment. As for the errors you're facing, I cannot debug without the full error message.

Could you share your settings as well?

That is all I am able to get as an error message:

(screenshot of the error popup)

These are my extension settings:

{
  "llm.attributionEndpoint": "https://stack.dataportraits.org/overlap",
  "llm.attributionWindowSize": 250,
  "llm.configTemplate": "bigcode/starcoder",
  "llm.contextWindow": 8192,
  "llm.documentFilter": {
    "pattern": "**"
  },
  "llm.enableAutoSuggest": true,
  "llm.fillInTheMiddle.enabled": true,
  "llm.fillInTheMiddle.middle": "<fim_middle>",
  "llm.fillInTheMiddle.prefix": "<fim_prefix>",
  "llm.fillInTheMiddle.suffix": "<fim_suffix>",
  "llm.lsp.binaryPath": null,
  "llm.lsp.logLevel": "warn",
  "llm.maxNewTokens": 60,
  "llm.modelIdOrEndpoint": "bigcode/starcoder",
  "llm.temperature": 0.2,
  "llm.tlsSkipVerifyInsecure": false,
  "llm.tokenizer": null,
  "llm.tokensToClear": [
    "<|endoftext|>"
  ]
}

@NicolasAG it looks like you're running VSCode on a remote env, what platform is it?

That is correct! :)
I'm not sure I understand what you mean by "platform"..? Is there a command you want me to run?

This is my setup: I have an interactive job that mounts my code and runs an infinite loop. I ssh into that job directly from VSCode to write code directly on the servers and launch quick debug scripts.

I'm not sure llm-ls supports remote file setups yet, given it probably runs on your machine and not the remote host.

By "platform" I meant: what type of OS and architecture is your remote code stored on?
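
For example, the output of the following on the remote host is the kind of info I'm after:

# print kernel, hostname and CPU architecture of the remote host
uname -a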

Ah right. my remote env is this: Linux 15f74b68-a0f7-4d9d-9be0-3258f1dda2de 6.2.0-32-generic #32~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Fri Aug 18 10:40:13 UTC 2 x86_64 x86_64 x86_64 GNU/Linux

But note that it was working fine the first time I installed it a few months ago...
Does it need a GPU to run inference, or is CPU-only fine? I recently changed my remote work environment to be CPU-only.

It does not run the model locally; it queries an API, which is our inference API by default but can be any API you choose.
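
As a quick sanity check (a hypothetical example, not an official debugging step), you can query the default Inference API directly, independently of the extension; HF_TOKEN below is assumed to hold your Hugging Face API token:

# POST a prompt to the default endpoint for the model set in llm.modelIdOrEndpoint
curl https://api-inference.huggingface.co/models/bigcode/starcoder \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "def fibonacci(n):"}'

If that returns a completion, the API side is fine and the problem sits between VSCode and llm-ls.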

There were some major changes recently; we now have https://github.com/huggingface/llm-ls running with the extension. Your error is saying that the connection between VSCode and llm-ls is broken for some reason. I work on Apple silicon macOS and have had no issues running the server, and I'm pretty sure it should also work on x86 macOS.
My initial guess is that the problem comes from using VSCode on a remote directory, but I'm not sure that's the root cause here.

I'm facing the same issue. I'm running an x86 Mac (Ventura 13.4), VSCode 1.83.1, and llm-vscode 0.1.5. If I roll the llm-vscode extension back to 0.0.38, everything works fine. After going back and forth multiple times (mostly to confirm before commenting), llm-vscode 0.1.5 is working again. I didn't change anything else, and the remote instance running the model has not changed. When the llm-ls server was failing, there was an error message in the output tab: thread 'main' panicked at crates/llm-ls/src/main.rs:723:18

I was able to get more details on the root cause of the error.
This is what I got right after logging into my remote environment in VSCode:

[Error - 1:27:33 PM] Client LLM VS Code: connection to server is erroring. Shutting down server.
[Error - 1:27:33 PM] Stopping server failed
Error: Client is not running and can't be stopped. It's current state is: starting
	at LanguageClient.shutdown (/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/node_modules/vscode-languageclient/lib/common/client.js:914:19)
	at LanguageClient.stop (/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/node_modules/vscode-languageclient/lib/common/client.js:885:21)
	at LanguageClient.stop (/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/node_modules/vscode-languageclient/lib/node/main.js:150:22)
	at LanguageClient.handleConnectionError (/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/node_modules/vscode-languageclient/lib/common/client.js:1146:18)
	at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
[Error - 1:27:33 PM] Server initialization failed.
  Message: write EPIPE
  Code: -32099 
[Error - 1:27:33 PM] LLM VS Code client: couldn't create connection to server.
  Message: write EPIPE
  Code: -32099 
[Info  - 1:27:33 PM] Connection to server got closed. Server will restart.
true
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
[Error - 1:27:33 PM] Server initialization failed.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] LLM VS Code client: couldn't create connection to server.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] Restarting server failed
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Info  - 1:27:33 PM] Connection to server got closed. Server will restart.
true
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
[Error - 1:27:33 PM] Server initialization failed.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] LLM VS Code client: couldn't create connection to server.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] Restarting server failed
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Info  - 1:27:33 PM] Connection to server got closed. Server will restart.
true
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
[Error - 1:27:33 PM] Server initialization failed.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] LLM VS Code client: couldn't create connection to server.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] Restarting server failed
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Info  - 1:27:33 PM] Connection to server got closed. Server will restart.
true
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
/home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /home/toolkit/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls)
[Error - 1:27:33 PM] Server initialization failed.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] LLM VS Code client: couldn't create connection to server.
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] Restarting server failed
  Message: Cannot call write after a stream was destroyed
  Code: -32099 
[Error - 1:27:33 PM] The LLM VS Code server crashed 5 times in the last 3 minutes. The server will not be restarted. See the output for more information.
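
If I read the GLIBC lines correctly, the prebuilt llm-ls binary shipped with the extension was linked against a newer glibc than my remote host provides. A rough way to compare the two (assuming binutils is available on the host):

# glibc version available on the host
ldd --version | head -n1

# GLIBC symbol versions the bundled llm-ls binary requires
objdump -T ~/.vscode-server/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls | grep -o 'GLIBC_[0-9.]*' | sort -Vu

If the second list contains versions newer than what the first reports, the binary cannot run on that host.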

PS: @McPatate I created #97 because I thought you wanted a separate issue, but now I'm not sure anymore ^^ feel free to close it, or respond in #97 if you decide to keep it ;)

Same problem here. Error when using the extension on a remote server, where it used to work OK but doesn't now.

If it is of any use, here is the info for the remote and local machines.

Remote machine:

Linux redacted 5.15.0-1045-oracle #51~20.04.1-Ubuntu SMP Fri Sep 22 14:26:20 UTC 2023 aarch64 aarch64 aarch64 GNU/Linux
Architecture:                       aarch64
CPU op-mode(s):                     32-bit, 64-bit
Byte Order:                         Little Endian
CPU(s):                             4
On-line CPU(s) list:                0-3
Thread(s) per core:                 1
Core(s) per socket:                 4
Socket(s):                          1
NUMA node(s):                       1
Vendor ID:                          ARM
Model:                              1
Model name:                         Neoverse-N1
Stepping:                           r3p1
BogoMIPS:                           50.00
NUMA node0 CPU(s):                  0-3
Vulnerability Gather data sampling: Not affected
Vulnerability Itlb multihit:        Not affected
Vulnerability L1tf:                 Not affected
Vulnerability Mds:                  Not affected
Vulnerability Meltdown:             Not affected
Vulnerability Mmio stale data:      Not affected
Vulnerability Retbleed:             Not affected
Vulnerability Spec rstack overflow: Not affected
Vulnerability Spec store bypass:    Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1:           Mitigation; __user pointer sanitization
Vulnerability Spectre v2:           Mitigation; CSV2, BHB
Vulnerability Srbds:                Not affected
Vulnerability Tsx async abort:      Not affected
Flags:                              fp asimd evtstrm aes pmull sha1 sha2 crc32 atomics fphp asimdhp cpuid asimdrdm lrcpc dcpop asimddp ssbs

Local machine:

Linux redacted 5.15.0-86-generic #96-Ubuntu SMP Wed Sep 20 08:23:49 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Architecture:            x86_64
  CPU op-mode(s):        32-bit, 64-bit
  Address sizes:         39 bits physical, 48 bits virtual
  Byte Order:            Little Endian
CPU(s):                  4
  On-line CPU(s) list:   0-3
Vendor ID:               GenuineIntel
  Model name:            Intel(R) Core(TM) i5-4200U CPU @ 1.60GHz
    CPU family:          6
    Model:               69
    Thread(s) per core:  2
    Core(s) per socket:  2
    Socket(s):           1
    Stepping:            1
    CPU max MHz:         2600.0000
    CPU min MHz:         800.0000
    BogoMIPS:            4589.23
    Flags:               fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid sse4_1 sse4_2 movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm cpuid_fault epb invpcid_single pti ssbd ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid xsaveopt dtherm ida arat pln pts md_clear flush_l1d
Virtualization features: 
  Virtualization:        VT-x
Caches (sum of all):     
  L1d:                   64 KiB (2 instances)
  L1i:                   64 KiB (2 instances)
  L2:                    512 KiB (2 instances)
  L3:                    3 MiB (1 instance)
NUMA:                    
  NUMA node(s):          1
  NUMA node0 CPU(s):     0-3
Vulnerabilities:         
  Gather data sampling:  Not affected
  Itlb multihit:         KVM: Mitigation: VMX disabled
  L1tf:                  Mitigation; PTE Inversion; VMX conditional cache flushes, SMT vulnerable
  Mds:                   Mitigation; Clear CPU buffers; SMT vulnerable
  Meltdown:              Mitigation; PTI
  Mmio stale data:       Unknown: No mitigations
  Retbleed:              Not affected
  Spec rstack overflow:  Not affected
  Spec store bypass:     Mitigation; Speculative Store Bypass disabled via prctl and seccomp
  Spectre v1:            Mitigation; usercopy/swapgs barriers and __user pointer sanitization
  Spectre v2:            Mitigation; Retpolines, IBPB conditional, IBRS_FW, STIBP conditional, RSB filling, PBRSB-eIBRS Not affected
  Srbds:                 Mitigation; Microcode
  Tsx async abort:       Not affected

For me, the issue seems to be llm-ls failing on a Mac after rebooting. I tracked it down to Rust's std::time::Instant and was able to build an llm-ls binary that now works for me. I opened an issue on the llm-ls repo describing the problem and my interim solution: huggingface/llm-ls#37
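
If anyone wants to check whether they're hitting the same panic, running the bundled binary by hand with a full backtrace should surface it outside of VSCode (the path below is an example for a local install of this extension version; adjust to your version and platform):

# run the bundled llm-ls directly; a full backtrace pinpoints the Instant panic
RUST_BACKTRACE=full ~/.vscode/extensions/huggingface.huggingface-vscode-0.1.5/server/llm-ls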

I have very similar logs to @NicolasAG.
Interestingly, this only happens when running in a dev container, not when running locally.
Unfortunately, I went through quite a lot of pain to move my whole workflow to dev containers for portability, and this breaks that, so it's not usable for me right now. Would love to help debug.

Additional details:

  • on local env: v0.1.3 works but v0.1.4 and up do not work
  • on remote env: v0.0.38 works but v1.0.0 and up do not work
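
For anyone else testing version combinations: a specific extension version can be pinned from the command line as well as through the UI (the version below is just one reported working above):

# install a specific version of the extension instead of the latest
code --install-extension huggingface.huggingface-vscode@0.1.3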

Is there any fix planned for this?

I tried to downgrade the extension to the reported working version, and now I am getting the following error:

data did not match any variant of untagged enum TokenizerConfig

Is there any solution for the "Client is not running" issue in a local environment?

I'm seeing the "Client is not running" issue on a local Ubuntu environment.

Hi, I have the same issue locally on Windows:
2024-05-31 17:36:57.256 [info] thread 'main' panicked at crates\llm-ls\src\main.rs:772:18:
instant to be in bounds
stack backtrace:

2024-05-31 17:36:57.272 [info] note: Some details are omitted, run with RUST_BACKTRACE=full for a verbose backtrace.

2024-05-31 17:36:57.300 [info] [Error - 17:36:57] Server initialization failed.
2024-05-31 17:36:57.300 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.300 [info] [Info - 17:36:57] Connection to server got closed. Server will restart.
2024-05-31 17:36:57.300 [info] true
2024-05-31 17:36:57.300 [info] [Error - 17:36:57] LLM VS Code client: couldn't create connection to server.
2024-05-31 17:36:57.300 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.301 [info] [Error - 17:36:57] Server process exited with code 101.
2024-05-31 17:36:57.364 [info] thread 'main' panicked at crates\llm-ls\src\main.rs:772:18:
instant to be in bounds
stack backtrace:

2024-05-31 17:36:57.372 [info] note: Some details are omitted, run with RUST_BACKTRACE=full for a verbose backtrace.

2024-05-31 17:36:57.375 [info] [Error - 17:36:57] Server initialization failed.
2024-05-31 17:36:57.375 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.375 [info] [Info - 17:36:57] Connection to server got closed. Server will restart.
2024-05-31 17:36:57.375 [info] true
2024-05-31 17:36:57.375 [info] [Error - 17:36:57] LLM VS Code client: couldn't create connection to server.
2024-05-31 17:36:57.375 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.389 [info] [Error - 17:36:57] Server process exited with code 101.
2024-05-31 17:36:57.411 [info] thread 'main' panicked at crates\llm-ls\src\main.rs:772:18:
instant to be in bounds
stack backtrace:

2024-05-31 17:36:57.417 [info] note: Some details are omitted, run with RUST_BACKTRACE=full for a verbose backtrace.

2024-05-31 17:36:57.420 [info] [Error - 17:36:57] Server initialization failed.
2024-05-31 17:36:57.420 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.420 [info] [Info - 17:36:57] Connection to server got closed. Server will restart.
2024-05-31 17:36:57.420 [info] true
2024-05-31 17:36:57.420 [info] [Error - 17:36:57] LLM VS Code client: couldn't create connection to server.
2024-05-31 17:36:57.420 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.420 [info] [Error - 17:36:57] Restarting server failed
2024-05-31 17:36:57.420 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.434 [info] [Error - 17:36:57] Server process exited with code 101.
2024-05-31 17:36:57.455 [info] thread 'main' panicked at crates\llm-ls\src\main.rs:772:18:
instant to be in bounds
stack backtrace:

2024-05-31 17:36:57.461 [info] note: Some details are omitted, run with RUST_BACKTRACE=full for a verbose backtrace.

2024-05-31 17:36:57.464 [info] [Error - 17:36:57] Server initialization failed.
2024-05-31 17:36:57.464 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.464 [info] [Info - 17:36:57] Connection to server got closed. Server will restart.
2024-05-31 17:36:57.464 [info] true
2024-05-31 17:36:57.464 [info] [Error - 17:36:57] LLM VS Code client: couldn't create connection to server.
2024-05-31 17:36:57.464 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.464 [info] [Error - 17:36:57] Restarting server failed
2024-05-31 17:36:57.464 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:57.478 [info] [Error - 17:36:57] Server process exited with code 101.
2024-05-31 17:36:57.499 [info] thread 'main' panicked at crates\llm-ls\src\main.rs:772:18:
instant to be in bounds
stack backtrace:

2024-05-31 17:36:58.570 [info] [Error - 17:36:58] Server process exited with code 101.
2024-05-31 17:36:58.571 [info] note: Some details are omitted, run with RUST_BACKTRACE=full for a verbose backtrace.

2024-05-31 17:36:58.571 [info] [Error - 17:36:58] Server initialization failed.
2024-05-31 17:36:58.571 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:58.571 [info] [Error - 17:36:58] The LLM VS Code server crashed 5 times in the last 3 minutes. The server will not be restarted. See the output for more information.
2024-05-31 17:36:58.571 [info] [Error - 17:36:58] LLM VS Code client: couldn't create connection to server.
2024-05-31 17:36:58.571 [info] Message: Pending response rejected since connection got disposed
Code: -32097
2024-05-31 17:36:58.572 [info] [Error - 17:36:58] Restarting server failed
2024-05-31 17:36:58.572 [info] Message: Pending response rejected since connection got disposed
Code: -32097

VSCode version:
Version: 1.89.1 (user setup)
Commit: dc96b837cf6bb4af9cd736aa3af08cf8279f7685
Date: 2024-05-07T05:13:33.891Z
Electron: 28.2.8
ElectronBuildId: 27744544
Chromium: 120.0.6099.291
Node.js: 18.18.2
V8: 12.0.267.19-electron.0
OS: Windows_NT x64 10.0.22631

Hi, I have the same issue locally on Windows: 2024-05-31 17:36:57.256 [info] thread 'main' panicked at crates\llm-ls\src\main.rs:772:18: instant to be in bounds stack backtrace:

Hi, I have also activated the extension in WSL ("WSL: Ubuntu-22.04").
I don't get error messages, but I don't see any code completion either.

I'm also getting this error on v0.2.2 (and older versions) on Windows.
Is there any update on this issue?
Btw, it used to work a week ago, and now it just stopped working.