Client LLM VS Code: connection to server is erroring. Shutting down server.
IngoTB303 opened this issue · 4 comments
When starting VS Code, I get these errors with v0.1.6:
write EPIPE
Client is not running and can't be stopped. It's current state is: starting
write EPIPE
Pending response rejected since connection got disposed
Client is not running and can't be stopped. It's current state is: starting
Client is not running and can't be stopped. It's current state is: startFailed
Client is not running and can't be stopped. It's current state is: startFailed
Client is not running and can't be stopped. It's current state is: startFailed
config:
endpoint: https://stack.dataportraits.org/overlap
template: bigcode/starcoder
context window: 8192
Could you paste the relevant sections of your settings.json, please?
Hi, of course, although it's working again today. Maybe it was a temporary access error?
"HuggingFaceCode.isFillMode": true,
"HuggingFaceCode.autoregressiveModeTemplate": "[prefix]",
"HuggingFaceCode.fillModeTemplate": "<PRE> [prefix] <SUF>[suffix] <MID>",
"HuggingFaceCode.temperature": 0.2,
"HuggingFaceCode.stopTokens": [
"<|endoftext|>",
"<EOT>"
],
"HuggingFaceCode.tokensToClear": [
"<MID>"
],
"HuggingFaceCode.configTemplate": "codellama/CodeLlama-13b-hf",
"HuggingFaceCode.modelIdOrEndpoint": "codellama/CodeLlama-13b-hf",
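For context, the fill-mode template and stop/clear token settings above just describe how the text around the cursor is spliced into one prompt and how the completion is cleaned up afterwards. A minimal sketch of that substitution, using the values from the settings above (the function names here are illustrative, not the extension's actual code):

```python
# Illustrative sketch of fill-in-the-middle (FIM) prompt assembly and
# completion cleanup, based on the settings shown above. Not the
# extension's real implementation.

FILL_MODE_TEMPLATE = "<PRE> [prefix] <SUF>[suffix] <MID>"
STOP_TOKENS = ["<|endoftext|>", "<EOT>"]
TOKENS_TO_CLEAR = ["<MID>"]

def build_fim_prompt(prefix: str, suffix: str,
                     template: str = FILL_MODE_TEMPLATE) -> str:
    """Substitute the text before/after the cursor into the FIM template."""
    return template.replace("[prefix]", prefix).replace("[suffix]", suffix)

def clean_completion(completion: str) -> str:
    """Truncate at the first stop token, then strip marker tokens."""
    for tok in STOP_TOKENS:
        idx = completion.find(tok)
        if idx != -1:
            completion = completion[:idx]
    for tok in TOKENS_TO_CLEAR:
        completion = completion.replace(tok, "")
    return completion

prompt = build_fim_prompt("def add(a, b):\n    ", "\n\nprint(add(1, 2))")
```

This is only to show what the `fillModeTemplate`, `stopTokens`, and `tokensToClear` settings control; the actual request handling happens inside the extension's language server.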
Aren't your settings out of date? v0.1.x renamed everything.
This issue is stale because it has been open for 30 days with no activity.