huggingface/llm-vscode

Raises OS ERROR 123 in VS Code (Windows)

bonuschild opened this issue · 5 comments

  • I've set the endpoint to https://github.com/LucienShui/huggingface-vscode-endpoint-server, but then every time I press a key it raises "OS ERROR 123": "The filename, directory name, or volume label syntax is incorrect."

  • However, if I use curl to visit the API URL it gives a correct response. So what is the real problem? I cannot use the `Phind-CodeLLama-34B-V2` self-hosted model to help me with coding.

Please take a look, thanks!

BTW, I set the Python interpreter to an env created by miniconda (anaconda). Could that cause a problem?


I've opened a main.cpp and written some code. The same error still occurs, so it may not be a Python interpreter problem :)

I figured it out: the `\` in the path is interpreted as an escape character and breaks the syntax. I manually changed it in settings.json and it works. I think this is a bug in the Windows version of the llm-vscode plugin.

Hi @bonuschild, could you give more details on what the problem is and how you solved it?

I figured it out: the `\` in the path is interpreted as an escape character and breaks the syntax.

What setting parameter are you talking about?


Every backslash in a Windows system path should be changed from `\` to `\\`, e.g.:
[screenshot: settings.json with the Windows path backslashes escaped]
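
For illustration, here is a minimal sketch of the escaping in VS Code's settings.json (JSONC, so comments are allowed). The key names below are made up for the example and are not the extension's actual schema; check the llm-vscode README for the real setting names. The point is only the path handling: in JSON, `\` starts an escape sequence, so a bare Windows path produces an invalid string, which then surfaces as OS error 123 (invalid filename, directory name, or volume label syntax).

```jsonc
{
  // Wrong: "\m" and "\t" are read as escape sequences, so the path is
  // mangled and Windows reports error 123.
  // "llm.example.tokenizerPath": "C:\models\tokenizer.json",

  // Correct: double every backslash...
  "llm.example.tokenizerPath": "C:\\models\\tokenizer.json",

  // ...or use forward slashes, which Windows file APIs also accept.
  "llm.example.logFilePath": "C:/llm/llm.log"
}
```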