Error when using a model converted from Safetensors to ONNX
Closed this issue · 2 comments
I converted .safetensors files distributed on HuggingFace and Civitai to .onnx files with the following program:
https://github.com/Amblyopius/Stable-Diffusion-ONNX-FP16
When I ran this program with the converted model, the following error appeared and it did not work:
Microsoft.ML.OnnxRuntime.OnnxRuntimeException: '[ErrorCode:InvalidArgument] Tensor element data type discovered: Int64 metadata expected: Float'
I'm using the DirectML branch.
This is a mismatch between the input data type and the data type the model expects. The model has both Float and optimized FP16 versions. You can open the model in Netron to see the expected input types. The branch you forked is the Int64 branch; the main branch is Float. You can use any branch and update the EP in the config and the ONNX runtime packages to make it work with the EP of your choice.
When I checked the model converted from safetensors in Netron, the type of the "timestep" input in unet.onnx was Float32.
So when I changed unet.cs as follows, the error disappeared and I was able to generate an image with that model:
public static List<NamedOnnxValue> CreateUnetModelInputFloat(Tensor<float> encoderHiddenStates, Tensor<float> sample, float timeStep)
{
    var input = new List<NamedOnnxValue> {
        NamedOnnxValue.CreateFromTensor("encoder_hidden_states", encoderHiddenStates),
        NamedOnnxValue.CreateFromTensor("sample", sample),
        // "timestep" is passed as a Float32 tensor to match the converted model
        NamedOnnxValue.CreateFromTensor("timestep", new DenseTensor<float>(new float[] { timeStep }, new int[] { 1 }))
    };
    return input;
}
Thank you