Uncaught Error: Can't create a session for remove-background-client demo
System Info
Transformers.js version: 2.15.0
Browser: Chrome
Operating system: Windows 10 (Version 23H2) with WSL Ubuntu 22.04
npm: 9.5.0
node: v19.7.0
Environment/Platform
- Website/web-app
- Browser extension
- Server-side (e.g., Node.js, Deno, Bun)
- Desktop app (e.g., Electron)
- Other (e.g., VSCode extension)
Description
Got this error in the Chrome console:

```
d84ec25b-321a-4d56-8c6d-71d6111285c9:6 Failed to load model because protobuf parsing failed.
lt @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
P @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
$func11504 @ ort-wasm-simd.wasm:0x82c2bc
$func2149 @ ort-wasm-simd.wasm:0x16396e
$func584 @ ort-wasm-simd.wasm:0x48a63
$func11427 @ ort-wasm-simd.wasm:0x829582
$func4164 @ ort-wasm-simd.wasm:0x339b6f
$func4160 @ ort-wasm-simd.wasm:0x339aff
j @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
$func356 @ ort-wasm-simd.wasm:0x2e215
j @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
$func339 @ ort-wasm-simd.wasm:0x28e06
$Ra @ ort-wasm-simd.wasm:0x6ebffb
e._OrtCreateSession @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
e.createSessionFinalize @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
e.createSession @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
self.onmessage @ d84ec25b-321a-4d56-8c6d-71d6111285c9:6
localhost/:1 Uncaught Error: Can't create a session
at e.createSessionFinalize (blob:http://localhost:5173/d84ec25b-321a-4d56-8c6d-71d6111285c9:6:62451)
at e.createSession (blob:http://localhost:5173/d84ec25b-321a-4d56-8c6d-71d6111285c9:6:63049)
at self.onmessage (blob:http://localhost:5173/d84ec25b-321a-4d56-8c6d-71d6111285c9:6:73231)
```
Reproduction
- Download a ZIP of the repo.
- Copy the files of the remove-background-client example.
- Run `npm install`.
- Download the ONNX file from https://huggingface.co/briaai/RMBG-1.4/resolve/main/onnx/model.onnx?download=true and put it inside /mnt/d/16-LLM-Cache/models_for_transformers.js/briaai/RMBG-1.4/onnx (a Node sketch of this step follows the reproduction steps below).
- Modify the top of main.js to load the local model, as follows:

```js
import './style.css';
import { AutoModel, AutoProcessor, env, RawImage } from '@xenova/transformers';
// Allow loading models from a local path instead of the Hugging Face Hub
env.allowLocalModels = true;
env.remoteHost = 'https://hf-mirror.com';
// Specify a custom location for models (defaults to '/models/').
env.localModelPath = '/mnt/d/16-LLM-Cache/models_for_transformers.js/';
// Disable the loading of remote models from the Hugging Face Hub:
env.allowRemoteModels = false;
// Proxy the WASM backend to prevent the UI from freezing
env.backends.onnx.wasm.proxy = true;
// Constants
const EXAMPLE_URL = 'https://images.pexels.com/photos/5965592/pexels-photo-5965592.jpeg?auto=compress&cs=tinysrgb&w=1024';
// Reference the elements that we will need
const status = document.getElementById('status');
const fileUpload = document.getElementById('upload');
const imageContainer = document.getElementById('container');
const example = document.getElementById('example');
// Load model and processor
status.textContent = 'Loading model...';
const model = await AutoModel.from_pretrained('briaai/RMBG-1.4', {
    // Do not require config.json to be present in the repository
    config: { model_type: 'custom' },
    quantized: false,
    local_files_only: true
});
```
- Run `npm run dev`.
The model then fails to load with: Failed to load model because protobuf parsing failed.
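For completeness, the download step can also be scripted. A minimal Node sketch, assuming Node 18+ (for the global fetch) and using the URL and target directory from the steps above:

```js
// Fetch the RMBG-1.4 ONNX weights and write them to the directory the
// reproduction uses. Adjust the path to your own layout.
import { mkdir, writeFile } from 'node:fs/promises';

const url = 'https://huggingface.co/briaai/RMBG-1.4/resolve/main/onnx/model.onnx?download=true';
const dir = '/mnt/d/16-LLM-Cache/models_for_transformers.js/briaai/RMBG-1.4/onnx';

await mkdir(dir, { recursive: true });
const res = await fetch(url);
if (!res.ok) throw new Error(`Download failed: ${res.status}`);
await writeFile(`${dir}/model.onnx`, Buffer.from(await res.arrayBuffer()));
```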
Can you check that the .onnx file is actually being loaded? My first instinct is that the URL doesn't resolve to the correct file. Also note that in the browser, you are not able to access files via a normal filesystem path (e.g., /mnt/...). You should move the model into your "public" folder and then set the localModelPath accordingly.
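One way to check is to fetch the model URL from the page and inspect what the dev server actually returns. A minimal sketch, assuming the path transformers.js would build from the localModelPath in the reproduction:

```js
// Run in the browser console on the dev page. If the server cannot resolve
// the path, Vite may fall back to index.html, and ONNX Runtime then fails
// to parse the returned HTML as protobuf.
const res = await fetch('/mnt/d/16-LLM-Cache/models_for_transformers.js/briaai/RMBG-1.4/onnx/model.onnx');
console.log(res.status, res.headers.get('content-type'));
```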
@xenova I got it. I needed to create a `public` folder; the relative path is then remove-background-client/public/models/briaai/RMBG-1.4/onnx/model.onnx. Now it works. Thank you very much.
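For anyone hitting the same error, here is a minimal sketch of the working setup described above (Vite serves everything under public/ from the site root, and '/models/' is the library's default localModelPath):

```js
// main.js: the model file lives at
//   remove-background-client/public/models/briaai/RMBG-1.4/onnx/model.onnx
// so Vite serves it at /models/briaai/RMBG-1.4/onnx/model.onnx.
import { AutoModel, env } from '@xenova/transformers';

env.allowLocalModels = true;
env.allowRemoteModels = false;
env.localModelPath = '/models/'; // the default, shown here for clarity

const model = await AutoModel.from_pretrained('briaai/RMBG-1.4', {
    config: { model_type: 'custom' }, // no config.json in the local folder
    quantized: false,
    local_files_only: true
});
```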