LLM VS Code client: couldn't create connection to server | llm-ls failed
raj2125 opened this issue · 6 comments
Hello - I'll provide some context on (1) what we are doing, (2) what changes we made, and (3) what issues we are facing.
(1) What are we doing? We are running a code completion pilot using llm-vscode, customizing the sign-in process to integrate with our organization's SSO (OAuth) service, and leveraging a Llama model hosted on-premise for the code completion feature.
(2) What changes did we make? We added oauth-login and oauth-logout commands and handlers to integrate with our SSO solution, and changed the llm.modelIdOrEndpoint config in package.json to point to our LLM completions API endpoint.
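Roughly, the command wiring looks like this (a simplified sketch; the command IDs and handler names are placeholders for our internal OAuth code):

```typescript
import * as vscode from 'vscode';

// Placeholders for our internal SSO/OAuth flow.
async function ssoLogin(): Promise<void> { /* redirect to org OAuth, store token */ }
async function ssoLogout(): Promise<void> { /* clear the stored token */ }

export function activate(context: vscode.ExtensionContext) {
  // The two commands we added on top of llm-vscode; both are also
  // declared under "contributes.commands" in package.json.
  context.subscriptions.push(
    vscode.commands.registerCommand('llm.oauth-login', ssoLogin),
    vscode.commands.registerCommand('llm.oauth-logout', ssoLogout)
  );
  // llm.modelIdOrEndpoint is pointed at our on-prem completions API
  // via the extension settings, not in code.
}
```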
We made the above changes, packaged them into a new extension, and installed it on our local machine (macOS Ventura 13.6.2).
(3) What issues are we facing? We have issues connecting to the llm-ls server. Below are a few errors:
LLM-VSCODE window:
rejected promise not handled within 1 second: Launching server using command /../CodeCompletionPilot/llm-vscode/server/llm-ls failed. Error: spawn /../CodeCompletionPilot/llm-vscode/server/llm-ls ENOENT
New window: the oauth-login command is invoked from the command palette. I get the errors below:
2023-12-20 12:21:53.063 [info] [Error - 12:21:52 PM] LLM VS Code client: couldn't create connection to server.
2023-12-20 12:21:53.063 [info] Launching server using command /../CodeCompletionPilot/llm-vscode/server/llm-ls failed. Error: spawn /../CodeCompletionPilot/llm-vscode/server/llm-ls ENOENT
2023-12-20 12:22:05.269 [info] [Error - 12:22:04 PM] LLM VS Code client: couldn't create connection to server.
2023-12-20 12:22:05.269 [info] Launching server using command /../CodeCompletionPilot/llm-vscode/server/llm-ls failed. Error: spawn /../CodeCompletionPilot/llm-vscode/server/llm-ls ENOENT
We'd appreciate any help or guidance in resolving the above issues. Thank you.
@McPatate Would appreciate a comment on the above issue; it's a blocker for us, and guidance from the HF team would help.
Hello @raj2125, how are you creating the VS Code extension? It seems that you are not bundling the language server. I'd advise you to check the GitHub workflows for more detail on how I package the extension.
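As a sanity check: spawn ENOENT means nothing exists at the launch path. Something like the following (the path is illustrative) will tell you whether the binary actually made it into the extension and is executable:

```typescript
import * as fs from 'fs';

// ENOENT => the file is missing; EACCES => it exists but isn't executable.
const serverPath = '/path/to/llm-vscode/server/llm-ls'; // illustrative
try {
  fs.accessSync(serverPath, fs.constants.X_OK);
  console.log('llm-ls is present and executable');
} catch (err) {
  console.error('llm-ls is missing or not executable:', err);
}
```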
@McPatate Thanks for the response. I used the steps below to package the extension:
Packaging
Run npm run package
This script:
- compiles the extension source code
- downloads llm-ls from hugging face (https://github.com/huggingface/llm-ls/releases/download/0.4.0/llm-ls-aarch64-apple-darwin.gz)
- decompresses the archive, moves llm-ls into place, and gives it execute permissions (see the sketch below)
- calls the vsce package command to generate a local .vsix file
Is anything missing in the above steps?
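For reference, a rough Node sketch of what our download/unpack step does (the real script differs in details; redirect handling and paths are illustrative, and the destination assumes server/ already exists):

```typescript
import { createWriteStream, chmodSync } from 'fs';
import { get } from 'https';
import { createGunzip } from 'zlib';

// Release asset from the URL above.
const url =
  'https://github.com/huggingface/llm-ls/releases/download/0.4.0/llm-ls-aarch64-apple-darwin.gz';
const dest = 'server/llm-ls';

function download(u: string): void {
  get(u, (res) => {
    // GitHub release downloads redirect to a CDN; follow the redirect.
    if (res.statusCode && res.statusCode >= 300 && res.headers.location) {
      download(res.headers.location);
      return;
    }
    res
      .pipe(createGunzip()) // the asset is a .gz, not a .zip
      .pipe(createWriteStream(dest))
      .on('finish', () => chmodSync(dest, 0o755)); // execute permissions
  });
}

download(url);
```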
Error details:
2023-12-21 09:59:20.999 [info] [Error - 9:59:19 AM] LLM VS Code client: couldn't create connection to server.
2023-12-21 09:59:20.999 [info] Launching server using command /Users/../llm-vscode/server/llm-ls failed. Error: spawn /Users/Z00B26W/Desktop/CodeCompletionPilot/llm-vscode/server/llm-ls ENOENT
2023-12-21 12:01:41.173 [info] [Error - 12:01:40 PM] LLM VS Code client: couldn't create connection to server.
2023-12-21 12:01:41.173 [info] Launching server using command /Users/../llm-vscode/server/llm-ls failed. Error: spawn /Users/../llm-vscode/server/llm-ls ENOENT
2023-12-21 12:48:06.847 [info] [Error - 12:48:06 PM] LLM VS Code client: couldn't create connection to server.
2023-12-21 12:48:06.848 [info] Launching server using command /Users/../llm-vscode/server/llm-ls failed. Error: spawn /Users/../llm-vscode/server/llm-ls ENOENT
I made some changes to the release scripts, and it's now connecting to the LS. Thanks.
Glad to hear that! Feel free to upstream your oauth code with a feature flag if there isn't anything too sensitive; I think other people could benefit from it.