stochasticai/x-stable-diffusion

tensorrt conversion fails

harishprabhala opened this issue · 11 comments

Hi. I am trying to convert my ONNX file to TensorRT using convert_unet_to_tensorrt.py, but I get the error below:

Traceback (most recent call last):
  File "convert_unet_to_tensorrt.py", line 52, in <module>
    convert(args)
  File "convert_unet_to_tensorrt.py", line 47, in convert
    f.write(serialized_engine)
TypeError: a bytes-like object is required, not 'NoneType' 

Except for the ONNX file mentioned in the library, none of my other ONNX files seem to convert. Any fix? Thanks.
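For context, this TypeError usually means the TensorRT build step returned `None` instead of a serialized engine, so the real failure happened before the `f.write()` call. A minimal sketch of a guard (the function name and messages are hypothetical, not from the repo) that surfaces the actual problem instead of crashing on the write:

```python
def write_engine(serialized_engine, path):
    """Write a serialized TensorRT engine to disk, failing loudly if the
    build step returned None (the cause of the TypeError above)."""
    if serialized_engine is None:
        # builder.build_serialized_network() returns None on failure;
        # enabling verbose logging (trt.Logger.VERBOSE) shows why.
        raise RuntimeError("TensorRT engine build failed; check the builder log")
    with open(path, "wb") as f:
        f.write(serialized_engine)
```

With a check like this, a failed ONNX parse or unsupported-layer error shows up as a clear message rather than `a bytes-like object is required, not 'NoneType'`.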

Hi @harishprabhala, are you using TensorRT 8.4.2.2.4? Our code requires TensorRT 8.4.2.2.4 to run.

I am using 8.4.1.5. I'll update it to the new version and try again. Will keep you posted.
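As an aside, comparing dotted version strings as plain strings can mislead (e.g. "8.4.10" < "8.4.2" lexically), so a tuple comparison is safer when checking an installed TensorRT against the required version. A small sketch using the versions quoted in this thread (`version_tuple` is a hypothetical helper, not part of the repo):

```python
def version_tuple(v):
    """Turn a dotted version string into a tuple of ints for correct comparison."""
    return tuple(int(part) for part in v.split("."))

# Required version from this thread vs. the reporter's installed version.
# On a real machine you would compare against tensorrt.__version__ instead.
required = version_tuple("8.4.2.2.4")
installed = version_tuple("8.4.1.5")
print(installed < required)  # → True: the installed TensorRT is too old
```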

I have tried it with 8.4.2 and I get the following error in demo.py:

[Image: WhatsApp Image 2022-11-10 at 5 40 56 PM]

If I comment out line 144, the denoising doesn't happen, and this is the output of demo.py:

[Image: WhatsApp Image 2022-11-10 at 4 23 11 PM]

@harishprabhala It seems that the conversion on your environment did not work well. How about trying on google colab : https://github.com/stochasticai/x-stable-diffusion/blob/main/TensorRT/Notebook.ipynb

@harishprabhala It seems that the conversion on your environment did work well. How about trying on google colab : https://github.com/stochasticai/x-stable-diffusion/blob/main/TensorRT/Notebook.ipynb

Conversion "did" work well or "didn't"? :) So, if the conversion went well, why are the predictions turning out this way? Re Colab, how would it be any different? Colab has its own set of issues: the git clone doesn't work properly and there's no terminal access. Is there any way we can fix this in a local environment? I am testing on a 2080 Ti GPU.

We did not test on the 2080 Ti GPU; we tested on A100 and T4 GPUs. Google Colab provides a T4, so we suggest testing on Colab or on a machine with an A100 or T4.

This is the output now on a T4. This is becoming impossible 😅

[Image: download]

Hi @harishprabhala, sorry for the late response. We have just tested the TensorRT code again on Google Colab and it works without any issue. Our tested notebook is here: link. If you are still interested, please test it again and let us know of any issues. Thank you.

@Toan-Do The output on my local T4 looks like this:
[Image: image_generated]

I also ran it on Google Colab, but it gives the following error:
NotImplementedError: A UTF-8 locale is required. Got ANSI_X3.4-1968

@ClementCJ On Google Colab, after converting the model, you should restart the runtime and start running from this line: https://colab.research.google.com/drive/1MAwk-DAngujQTaYh1eFZoOuocOhiKsge?usp=sharing#scrollTo=hLt_F8vpVGjl
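For the `ANSI_X3.4-1968` locale error above, a commonly cited Colab workaround (an assumption here, not something confirmed in this thread) is to force Python's preferred encoding to UTF-8 in a cell before running the rest of the notebook:

```python
import locale

# Colab sometimes reports ANSI_X3.4-1968 as the preferred encoding, which
# breaks libraries that insist on UTF-8. Overriding the getter is a widely
# used notebook workaround; run this before the rest of the pipeline.
locale.getpreferredencoding = lambda do_setlocale=True: "UTF-8"

print(locale.getpreferredencoding())  # → UTF-8
```

Restarting the runtime after model conversion, as suggested above, may also clear the error on its own.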