huggingface/transformers.js

Failed when using custom model (onnx)


System Info

  • transformers.js version: 3.0.2
  • Node version: v18.16.1
  • Browser: Google Chrome
  • Machine: MacBook M1

Environment/Platform

  • Website/web-app
  • Browser extension
  • Server-side (e.g., Node.js, Deno, Bun)
  • Desktop app (e.g., Electron)
  • Other (e.g., VSCode extension)

Description

Hi, I am trying to convert the model "distilbert-base-uncased-finetuned-sst-2-english" and use it in a browser extension. The conversion seems successful, but I get an error when I try to use the model in background.js. I was following the "Use custom models" guide. All references below are to the repo https://github.com/huggingface/transformers.js and its examples/extension directory.
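
For context, the loading code in my background.js is roughly the following (a simplified sketch of my setup; the real extension example wraps the pipeline in a singleton class, and the task and paths here are just what I use):

// background.js (sketch): load the locally converted model instead of
// fetching it from the Hugging Face Hub, as described in "Use custom models".
import { pipeline, env } from '@xenova/transformers';

env.allowRemoteModels = false;   // don't hit the Hub
env.localModelPath = 'models/';  // bundled under extension/public/models

// The error below is thrown while this promise resolves.
const classifier = await pipeline(
  'text-classification',
  'distilbert-base-uncased-finetuned-sst-2-english',
);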

Error

Uncaught (in promise) Error: Can't create a session
    at e.createSessionFinalize (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:15:450850)
    at e.createSession (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:15:451448)
    at e.createSession (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:15:443674)
    at e.OnnxruntimeWebAssemblySessionHandler.loadModel (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:15:446568)
    at async Object.createSessionHandler (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:15:156396)
    at async InferenceSession.create (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:430:25)
    at async constructSession (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:3456:16)
    at async Promise.all (index 1)
    at async DistilBertForSequenceClassification.from_pretrained (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:4123:20)
    at async AutoModelForSequenceClassification.from_pretrained (chrome-extension://kjjfglkckkakiohcfnhamlalgkecebof/background.js:8849:20)

I also see a warning which I think might be relevant:

D:/a/_work/1/s/onnxruntime/core/graph/model.cc:146 onnxruntime::Model::Model(ModelProto &&, const PathString &, const IOnnxRuntimeOpSchemaRegistryList *, const logging::Logger &, const ModelOptions &) Unsupported model IR version: 10, max supported IR version: 8

Reproduction

  1. Run python -m scripts.convert --quantize --model_id distilbert-base-uncased-finetuned-sst-2-english from the scripts directory to obtain the converted ONNX model.
  2. Move the generated "models" folder, which contains the converted distilbert-base-uncased-finetuned-sst-2-english model, to extension/public.
  3. Follow the README in the extension directory (npm install > npm run build > load the build folder in chrome://extensions/).
  4. Open the service worker console.
  5. Trigger the error by running inference with the model (enter text into the input box in the popup); the popup hands the text to the service worker roughly as in the sketch below.
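
For step 5, the code path is roughly the following (a simplified sketch; the element id, action name, and message shape are placeholders rather than the example's exact code):

// popup.js (sketch): forward the typed text to background.js, where the
// pipeline runs and the session-creation error above is thrown.
const input = document.getElementById('text'); // placeholder element id
input.addEventListener('input', () => {
  chrome.runtime.sendMessage({ action: 'classify', text: input.value }, (result) => {
    console.log('Result from background.js:', result);
  });
});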

Hi there 👋 Could you export the model in an environment with the following dependency versions? https://github.com/huggingface/transformers.js/blob/main/scripts/requirements.txt

Hey, thanks for the quick response.

It looks like this is the same environment I used in the first place, when I originally encountered the error.

Nevertheless, I exported the model again, but still got the same error.

Again I notice the same IR version issue: the script's current environment exports IR version 10, while the extension's runtime supports at most version 8:

(screenshot showing the same "Unsupported model IR version: 10, max supported IR version: 8" warning)

Here is a snapshot of the package.json in examples/extension:

{
  "name": "extension",
  "version": "0.0.1",
  "description": "Transformers.js | Sample browser extension",
  "scripts": {
    "build": "webpack",
    "dev": "webpack --watch"
  },
  "type": "module",
  "author": "Xenova",
  "license": "MIT",
  "devDependencies": {
    "copy-webpack-plugin": "^11.0.0",
    "html-webpack-plugin": "^5.5.1",
    "webpack": "^5.79.0",
    "webpack-cli": "^5.1.4"
  },
  "dependencies": {
    "@xenova/transformers": "^2.0.0"
  }
}

Do you think upgrading the version of onnxruntime-web could be a solution? (I don't know how to do that properly.)

This is not exactly the same problem, but I found a similar one regarding IR version incompatibility: microsoft/onnxruntime#16638

Ah, it looks like you're using an old version of transformers.js:

"@xenova/transformers": "^2.0.0"

Should be:

"@huggingface/transformers": "^3.0.2"