Cannot use local model, Z:/AI/models/model.onnx
I'm trying to use

```js
const classifierSession = await DiffusionPipeline.fromPretrained('C:\\users\\johna\\downloads\\model.onnx');
```

but it throws the error:
```
Uncaught HubApiError Error: Api error with status 404. Request ID: Root=1-659715a0-16b3db9344fef60e2f28f710, url: https://huggingface.co/C:/users/johna/downloads/model.onnx/resolve/main/model_index.json
    at createApiError (z:\AI\node_modules\@huggingface\hub\dist\index.mjs:27:17)
    at downloadFile (z:\AI\node_modules\@huggingface\hub\dist\index.mjs:708:17)
    at processTicksAndRejections (internal/process/task_queues:95:5)
    --- await ---
    at processTicksAndRejections (internal/process/task_queues:95:5)
    --- await ---
    at runMainESM (internal/modules/run_main:55:21)
    at executeUserEntryPoint (internal/modules/run_main:78:5)
    at (internal/main/run_main_module:23:47)
index.mjs:27
Process exited with code 1
```
How can I use a local model instead of attempting to reach an online one?
Hi. You need to pass a model directory that contains the text encoder, UNet, tokenizer config, and the other files, like here: https://huggingface.co/aislamov/stable-diffusion-2-1-base-onnx/tree/main
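For example, with a local directory that mirrors that repo (the path below is just an illustration based on your issue title):

```js
// The directory should contain model_index.json plus the text_encoder, unet,
// tokenizer and other subfolders, mirroring the hub repo linked above.
const pipeline = await DiffusionPipeline.fromPretrained('Z:/AI/models/stable-diffusion-2-1-base-onnx');
```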
I haven't been able to make what you describe work, @dakenf - it always treats the path as a Hugging Face model name and tries to download it rather than access it via the filesystem. I've tried:

```js
const pipeline = await StableDiffusionPipeline.fromPretrained('./models/stable-diffusion-2-1-base-onnx');
const pipeline = await StableDiffusionPipeline.fromPretrained('models/stable-diffusion-2-1-base-onnx');
const pipeline = await StableDiffusionPipeline.fromPretrained('./models/stable-diffusion-2-1-base-onnx/');
const pipeline = await StableDiffusionPipeline.fromPretrained('/usr/src/app/models/stable-diffusion-2-1-base-onnx');
```
None of these work as expected.
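In case it matters, here's the quick sanity check I used to rule out a missing file (plain Node fs, nothing from the library; the entries are just model_index.json - which the 404 URL above shows being requested first - plus the subfolders mentioned in the previous comment):

```js
import { existsSync } from 'node:fs';
import { join } from 'node:path';

const modelDir = '/usr/src/app/models/stable-diffusion-2-1-base-onnx';

// model_index.json is the first file the loader requests (see the 404 URL);
// the subfolders are the ones named in the earlier comment.
const expected = ['model_index.json', 'text_encoder', 'unet', 'tokenizer'];

for (const entry of expected) {
  const p = join(modelDir, entry);
  console.log(existsSync(p) ? `found   ${p}` : `MISSING ${p}`);
}
```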