twinnydotdev/twinny

Configured providers, but twinny is not sending any requests to the provider.

Opened this issue · 7 comments

Describe the bug
I have set up the providers shown below and verified with curl that the /api/generate endpoint at http://duodesk.duo:11434 works. The extension shows a loading circle but never sends any requests. I also tried setting the Ollama Hostname setting to duodesk.duo, but no luck.
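For reference, the curl check described above can be sketched like this (the host and model name come from this report; the prompt is only an arbitrary example):

```shell
# Sketch of the curl check mentioned above; host and model are taken from
# this issue, the prompt is an arbitrary example.
OLLAMA_URL="http://duodesk.duo:11434"
BODY='{"model":"codellama:7b-code","prompt":"def add(a, b):","stream":false}'

# --max-time keeps the check short when the host is unreachable.
RESP=$(curl -s --max-time 5 "${OLLAMA_URL}/api/generate" -d "$BODY" \
  || echo "unreachable")
echo "$RESP"
```

If this prints a JSON response but the extension still spins, the problem is on the extension side rather than the server.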

To Reproduce
I just added the providers shown in the attached screenshot.

Expected behavior
It should work with the providers I have configured, I think?

Screenshots
(screenshot of the configured providers)

Logging
Logging is enabled, but I'm not sure where I am supposed to see the logs. I checked the Output tab, but there is no entry for twinny.

API Provider
Ollama running at http://duodesk.duo:11434 in local network.

Chat or Auto Complete?
Both

Model Name
codellama:7b-code

Desktop (please complete the following information):

  • OS: Windows
  • Version: 11

Additional context

Looking through other issues, I found where the logs can be seen. The log is as follows:
(screenshot of the log output)

It looks like the hostname is set incorrectly; how do I change it?

Maybe try a restart? The settings look correct to me. Also check the Ollama options in the extension settings: click the cog in the extension header; there are some API settings for Ollama in there too.

How can I check the logs? I have the same problem on Windows; I'm running VS Code from WSL: Ubuntu.

  1. Install Ollama and check that it is running at http://localhost:11434
    (screenshot)

  2. Install the models
    (screenshot)

  3. Install and configure the VS Code extension
    Settings
    (screenshot)
    Providers
    (screenshot)

  4. Test the chat
    (screenshot)

I got this message from the VS Code dev tools:
(screenshot of the console error)

But the loading spinner stays there; any help? I also tried 127.0.0.1 as the host, but got the same results.

There is an issue with Twinny and WSL-connected VS Code windows.

Continue extension works, but I can't get Twinny to work. Let me know if there's a way for me to help debug this.

I've also tried setting the host value to 127.0.0.1, localhost, and the default 0.0.0.0. I restarted Ollama, VS Code, and my computer, but Twinny does not even reach the Ollama server. I do not have this issue when I run Twinny from within a VS Code instance that is running without any remote container connectivity (in my case WSL2 specifically).
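One thing worth checking with WSL2: the extension host runs inside WSL, where the Windows side is not reachable as localhost under the default NAT networking. A rough sketch for finding the Windows host address from inside WSL2 (this assumes the default setup, where the nameserver entry in /etc/resolv.conf points at the Windows host; under WSL's newer mirrored networking mode, localhost works directly and this lookup is unnecessary):

```shell
# Sketch: derive the Windows host IP from inside WSL2.
# Assumption: default WSL2 NAT networking, where the nameserver in
# /etc/resolv.conf is the Windows host.
HOST_IP=$(awk '/^nameserver/ {print $2; exit}' /etc/resolv.conf 2>/dev/null)
HOST_IP=${HOST_IP:-127.0.0.1}   # fall back if the file is missing
echo "Try pointing Twinny at: http://${HOST_IP}:11434"
```

Ollama on the Windows side also has to listen on an interface WSL can reach (for example by setting the OLLAMA_HOST environment variable to 0.0.0.0 before starting it), not just on 127.0.0.1.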

Relevant console logs(?):

ERR [Extension Host] Fetch error: TypeError: fetch failed
	at node:internal/deps/undici/undici:12345:11
	at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
	at async t.streamResponse (/home/user/.vscode-server/extensions/rjmacarthy.twinny-3.11.45/out/index.js:2:138539)
console.ts:137 [Extension Host] Fetch error: TypeError: fetch failed
	at node:internal/deps/undici/undici:12345:11
	at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
	at async t.streamResponse (/home/user/.vscode-server/extensions/rjmacarthy.twinny-3.11.45/out/index.js:2:138539)
y @ console.ts:137

I have Twinny 3.12.0 on Linux. I'm running Ollama with mistral-nemo and stable-code.
Curling Ollama works fine, but Twinny just keeps spinning the circle, like in the screenshots above.
My project runs in a Docker container, but that should not affect Twinny.

I do get a similar error message as above.

ERR [Extension Host] Fetch error: TypeError: fetch failed at node:internal/deps/undici/undici:12345:11 at process.processTicksAndRejections (node:internal/process/task_queues:95:5) at async streamResponse (/root/.vscode-server/extensions/rjmacarthy.twinny-3.12.0/out/index.js:20901:22)

Any suggestions on how to fix this?
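One detail in that log: the stack trace path /root/.vscode-server/... shows the extension host is running inside the container, so 127.0.0.1 from Twinny's point of view is the container itself, not the machine running Ollama. A hedged sketch of what to try instead (host.docker.internal is an assumption that holds on Docker Desktop; on plain Linux Docker you may need the bridge gateway IP or an --add-host entry instead):

```shell
# Candidate hostname for reaching the Docker host from inside a container.
# host.docker.internal is an assumption: it exists on Docker Desktop; on
# plain Linux Docker, start the container with
#   --add-host=host.docker.internal:host-gateway
# to get the same name.
CANDIDATE="host.docker.internal"
echo "Try setting the Twinny provider hostname to: ${CANDIDATE}"
```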

I also just noticed this; I am using VSCodium.

To be honest, I don't think this was always the case, so I will update when I figure it out, hopefully.

https://stackoverflow.com/questions/70590704/fetching-in-a-vscode-extension-node-fetch-and-nodehttp-issues

Judging from these two issues:

Cannot find module 'node:http' on AWS Lambda v14 and
Problem with importing node-fetch v3.0.0 and node v16.5.0

it looks like the upgrade from node-fetch v3 to v3.1 was "troublesome" and resulted in the error you are seeing for some.

A few users are downgrading to v2 (or you might try v3 rather than v3.1, which you are using).

Uninstall node-fetch and run npm install -s node-fetch@2
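For reference, the downgrade that answer suggests looks like this. It applies to the project that bundles node-fetch (here that would be the extension itself, so as users we can only pass this along to the maintainer); the commands are echoed rather than executed, since they only make sense inside that project:

```shell
# Downgrade commands from the quoted StackOverflow answer, echoed as a
# sketch; run them in the project that depends on node-fetch.
CMD_UNINSTALL="npm uninstall node-fetch"
CMD_INSTALL="npm install -s node-fetch@2"
echo "$CMD_UNINSTALL"
echo "$CMD_INSTALL"
```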