twinnydotdev/twinny

TypeError while attempting to use FIM from remote Open WebUI server

i-ate-a-vm opened this issue · 4 comments

Describe the bug
While attempting to use a remote Open WebUI server for FIM, I am experiencing a TypeError.

To Reproduce
Steps to reproduce the behavior:

  1. Install Twinny in VS Code
  2. Set up a remote Open WebUI server with Ollama
  3. Install the FIM model you want to use
  4. Set up a provider in Twinny using HTTPS, an Open WebUI API key, and the Open WebUI server as the API host

Expected behavior
I expect the code autocompletion to be generated on the Open WebUI server successfully, and for the resulting code to be correctly inserted into the code file.

Logging
I was able to find this error in the Extension logs:

***Twinny Stream Debug***
Streaming response from <my OpenWebUI hostname>:443.
Request body:
{
  "model": "codellama:7b-code",
  "prompt": "<my code>",
  "stream": true,
  "keep_alive": "5m",
  "options": {
    "temperature": 0.2,
    "num_predict": 500
  }
}


Request options:
{
  "hostname": "<my OpenWebUI hostname>",
  "port": 443,
  "path": "/ollama/api/generate",
  "protocol": "https",
  "method": "POST",
  "headers": {
    "Content-Type": "application/json",
    "Authorization": "Bearer <OpenWebUI API key>"
  }
}

ERR [Extension Host] TypeError: Cannot read properties of undefined (reading 'length')
	at t.getFimDataFromProvider (/Users/jameswestbrook/.vscode/extensions/rjmacarthy.twinny-3.11.28/out/index.js:2:150649)
	at t.CompletionProvider.onData (/Users/jameswestbrook/.vscode/extensions/rjmacarthy.twinny-3.11.28/out/index.js:2:125345)
	at onData (/Users/jameswestbrook/.vscode/extensions/rjmacarthy.twinny-3.11.28/out/index.js:2:124739)
	at Object.transform (/Users/jameswestbrook/.vscode/extensions/rjmacarthy.twinny-3.11.28/out/index.js:2:139215)
	at ensureIsPromise (node:internal/webstreams/util:192:19)
	at transformStreamDefaultControllerPerformTransform (node:internal/webstreams/transformstream:509:18)
	at transformStreamDefaultSinkWriteAlgorithm (node:internal/webstreams/transformstream:559:10)
	at Object.write (node:internal/webstreams/transformstream:364:14)
	at ensureIsPromise (node:internal/webstreams/util:192:19)
	at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1114:5)
	at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1229:5)
	at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1103:3)
	at writableStreamDefaultWriterWrite (node:internal/webstreams/writablestream:993:3)
	at [kChunk] (node:internal/webstreams/readablestream:1404:28)
	at readableStreamFulfillReadRequest (node:internal/webstreams/readablestream:1996:24)
	at readableStreamDefaultControllerEnqueue (node:internal/webstreams/readablestream:2187:5)
	at transformStreamDefaultControllerEnqueue (node:internal/webstreams/transformstream:490:5)
	at TransformStreamDefaultController.enqueue (node:internal/webstreams/transformstream:301:5)
	at Object.transform (node:internal/webstreams/encoding:156:22)
	at ensureIsPromise (node:internal/webstreams/util:192:19)
	at transformStreamDefaultControllerPerformTransform (node:internal/webstreams/transformstream:509:18)
	at transformStreamDefaultSinkWriteAlgorithm (node:internal/webstreams/transformstream:559:10)
	at Object.write (node:internal/webstreams/transformstream:364:14)
	at ensureIsPromise (node:internal/webstreams/util:192:19)
	at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1114:5)
	at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1229:5)
	at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1103:3)
	at writableStreamDefaultWriterWrite (node:internal/webstreams/writablestream:993:3)
	at [kChunk] (node:internal/webstreams/readablestream:1404:28)
	at readableStreamFulfillReadRequest (node:internal/webstreams/readablestream:1996:24)
	at readableStreamDefaultControllerEnqueue (node:internal/webstreams/readablestream:2187:5)
	at transformStreamDefaultControllerEnqueue (node:internal/webstreams/transformstream:490:5)
	at TransformStreamDefaultController.enqueue (node:internal/webstreams/transformstream:301:5)
	at Object.identityTransformAlgorithm [as transform] (node:internal/deps/undici/undici:10452:22)
	at ensureIsPromise (node:internal/webstreams/util:192:19)
	at transformStreamDefaultControllerPerformTransform (node:internal/webstreams/transformstream:509:18)
	at transformStreamDefaultSinkWriteAlgorithm (node:internal/webstreams/transformstream:559:10)
	at Object.write (node:internal/webstreams/transformstream:364:14)
	at ensureIsPromise (node:internal/webstreams/util:192:19)
	at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1114:5)
	at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1229:5)
	at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1103:3)
	at writableStreamDefaultWriterWrite (node:internal/webstreams/writablestream:993:3)
	at [kChunk] (node:internal/webstreams/readablestream:1404:28)
	at readableStreamFulfillReadRequest (node:internal/webstreams/readablestream:1996:24)
	at readableStreamDefaultControllerEnqueue (node:internal/webstreams/readablestream:2187:5)
	at ReadableStreamDefaultController.enqueue (node:internal/webstreams/readablestream:1041:5)
	at fetchParams.controller.resume (node:internal/deps/undici/undici:10843:45)
	at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
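The top frame, `getFimDataFromProvider`, reads `length` from a value that is `undefined`. A plausible cause (an assumption, not confirmed from the extension source) is that Open WebUI returned a streamed chunk without the `response` field that Ollama's `/api/generate` normally provides, e.g. an error object. A minimal TypeScript sketch of that failure mode, with hypothetical names:

```typescript
// Hypothetical shape of one streamed NDJSON chunk from Ollama's /api/generate.
interface GenerateChunk {
  response?: string; // the completion fragment; absent on error-shaped chunks
  done?: boolean;
}

// Naive extraction, mirroring the kind of access that throws in the log:
// reading `.length` of `chunk.response` when the field is missing.
function extractUnsafe(chunk: GenerateChunk): number {
  return (chunk.response as string).length; // TypeError if response is undefined
}

// Defensive variant: treat a missing field as an empty fragment.
function extractSafe(chunk: GenerateChunk): string {
  return typeof chunk.response === "string" ? chunk.response : "";
}

const ok: GenerateChunk = JSON.parse('{"response":"foo","done":false}');
const err: GenerateChunk = JSON.parse('{"detail":"Not authenticated"}');

console.log(extractSafe(ok));  // "foo"
console.log(extractSafe(err)); // ""
```

An auth or routing problem on the server side would surface exactly this way: the HTTP stream still arrives, but each chunk lacks the expected field.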

Desktop (please complete the following information):

  • OS: Mac M2, Sonoma 14.3
  • Browser: VS Code
  • Version: Latest version of Twinny, VS Code v1.88.1

Additional context
The Open WebUI server works for chat, and its logs show requests reaching the /api/generate route successfully, suggesting that this isn't a connectivity issue.
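The request can also be reproduced outside VS Code to separate server behavior from the extension. A sketch assuming the same endpoint, body, and headers as the logged request options (the hostname and API key are placeholders, not real values):

```typescript
// Reconstructs the logged request; HOST and API_KEY are placeholders.
const HOST = "my-openwebui.example.com";
const API_KEY = "sk-...";

// Builds the URL used in the logged request options.
function buildGenerateUrl(hostname: string, port = 443): string {
  return `https://${hostname}:${port}/ollama/api/generate`;
}

async function probe(): Promise<void> {
  const res = await fetch(buildGenerateUrl(HOST), {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: "codellama:7b-code",
      prompt: "def add(a, b):",
      stream: false, // non-streaming keeps the probe simple
      keep_alive: "5m",
      options: { temperature: 0.2, num_predict: 500 },
    }),
  });
  // A JSON body with a "response" string is what the extension expects;
  // an error object or HTML page here would explain the TypeError.
  console.log(res.status, await res.text());
}
// probe();
```

If this probe returns an error body rather than a generation, the problem is on the server or auth side rather than in the extension.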

Hello, do you know what the correct response from Ollama WebUI is? I am having issues running this API now that it has been updated, many thanks. Edit: Should be fixed in the latest release.

Executive decision: Individual configuration issues will receive an answer and will be considered closed after 24 hours of no replies. Many thanks.

Hi @rjmacarthy, should this be fixed in the latest Twinny release or Ollama release? I pulled the latest Twinny version and still see the issue. I'd appreciate a pointer to the bug in Ollama you saw resolved if that was what you were referring to. Thanks so much!

EDIT, scratch that. I'm still seeing an error but it is a different one, so I'll open a separate issue.

Hey, the FIM endpoint should be /ollama/api/generate when using Open WebUI. It was fixed here 60ee0fd and I tested it working. If it is still an issue, please check your configuration.
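The path difference described above can be captured in a small helper; `ProviderKind` and `fimPath` are hypothetical names for illustration, not the extension's actual API:

```typescript
// Open WebUI proxies Ollama under /ollama, while a direct Ollama server
// exposes /api/generate at the root.
type ProviderKind = "ollama" | "open-webui";

function fimPath(kind: ProviderKind): string {
  return kind === "open-webui" ? "/ollama/api/generate" : "/api/generate";
}

console.log(fimPath("open-webui")); // "/ollama/api/generate"
console.log(fimPath("ollama"));     // "/api/generate"
```

Pointing a provider configured for direct Ollama at an Open WebUI host (or vice versa) yields requests to the wrong path, which matches the symptom in this issue.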