Error: no available backend found. ERR: [wasm] RangeError: Out of memory
System Info
@huggingface/transformers 3.0.0-alpha.19, Mobile Safari, iOS 17.6.1
Environment/Platform
- Website/web-app
- Browser extension
- Server-side (e.g., Node.js, Deno, Bun)
- Desktop app (e.g., Electron)
- Other (e.g., VSCode extension)
Description
Getting the error "Error: no available backend found. ERR: [wasm] RangeError: Out of memory" in Mobile Safari when trying to load the model Xenova/clip-vit-base-patch16. The same model loads (albeit somewhat unstably) under transformers.js v2, and the same code works in desktop Chrome on macOS.
Reproduction
pipeline("zero-shot-image-classification", "Xenova/clip-vit-base-patch16")
This may or may not be related, but https://huggingface.co/spaces/webml-community/segment-anything-webgpu doesn't seem to work in mobile Safari either, though it looks like it's using an older build of v3. I'm wondering whether v3 just generally doesn't work on iOS.
WebGPU isn't officially supported by Safari yet. Maybe you can try Safari Technology Preview: https://webkit.org/blog/14879/webgpu-now-available-for-testing-in-safari-technology-preview/.
I just tried that in Mobile Safari. It didn't seem to change anything; same error.
So, with v3, are you saying WebGPU is the only engine available, and it will not fall back to a CPU version if WebGPU is unavailable?
CPU is available in v3. Auto-fallback has recently become an option, though you have to opt into it.
Set `device` to `wasm` in the inference settings.
So will setting `device` to `wasm` make the transformers.js code try WebGPU first and fall back to CPU if that fails, or would my code need to implement the fallback itself?
I tried this but am still getting the "Out of memory" error. Since the error message mentions "wasm", I'm guessing it was already falling back to wasm automatically. Did something change in the wasm engine since v2 that would cause this, given it was working in v2?
Try `"auto"`. `"wasm"` sets it to CPU only.
But on iOS, will that fix the out-of-memory problem, since CPU is all that would be available?
After changing it to "auto", it doesn't seem to try to load the wasm version at all. I'm not even getting the out-of-memory error now, so I'm not sure what it's doing, but it's not working either.
How is "auto" different from just not setting "device" at all?
Looks like this may be an iOS Safari issue that will only be fixed in iOS 18, which just came out, so using v3 is hard for me at the moment until iOS 18 is more widely adopted. Odd, though, that this worked on iOS 17 with transformers.js v2.