Can't start processing scripts (export_onnx.py and export_trt.py)
Maelstrom2014 opened this issue · 9 comments
Hi! How do I get this working on Windows?
c:\ai\comfyui>.\python_embeded\python.exe c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_onnx.py
Total VRAM 16379 MB, total RAM 49134 MB
pytorch version: 2.3.0+cu121
xformers version: 0.0.26.post1
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4060 Ti : native
VAE dtype: torch.bfloat16
WARNING: comfy_extras.chainner_models is deprecated and has been replaced by the spandrel library.
Traceback (most recent call last):
File "c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_onnx.py", line 113, in <module>
torch.onnx.export(upscale_model,
File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 516, in export
_export(
File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 1589, in _export
with exporter_context(model, training, verbose):
File "contextlib.py", line 137, in __enter__
File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 179, in exporter_context
with select_model_mode_for_export(
File "contextlib.py", line 137, in __enter__
File "c:\ai\comfyui\python_embeded\Lib\site-packages\torch\onnx\utils.py", line 140, in disable_apex_o2_state_dict_hook
for module in model.modules():
^^^^^^^^^^^^^
AttributeError: 'ImageModelDescriptor' object has no attribute 'modules'
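For context, this error happens because spandrel loads the upscaler as an `ImageModelDescriptor` wrapper rather than a plain `torch.nn.Module`, so `torch.onnx.export` can't call `.modules()` on it. A minimal sketch of the workaround, assuming (as spandrel documents) that the real torch module lives in the descriptor's `.model` attribute; `unwrap_for_export` is a hypothetical helper, not part of export_onnx.py:

```python
# Hedged sketch: spandrel's ImageModelDescriptor keeps the underlying
# torch.nn.Module in its `.model` attribute; the descriptor itself is not
# an nn.Module, hence the missing .modules() in the traceback above.
# `unwrap_for_export` is a hypothetical helper, not part of export_onnx.py.

def unwrap_for_export(loaded):
    """Return the underlying torch module whether `loaded` is a spandrel
    descriptor (has a .model attribute) or already a plain module."""
    return getattr(loaded, "model", loaded)

# torch.onnx.export(unwrap_for_export(upscale_model), ...) would then
# receive a real nn.Module that does have a .modules() method.
```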
And this:
c:\ai\comfyui>.\python_embeded\python.exe c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_trt.py
Traceback (most recent call last):
File "c:\ai\comfyui\ComfyUI\custom_nodes\ComfyUI-Upscaler-Tensorrt\export_trt.py", line 3, in <module>
from utilities import Engine
ModuleNotFoundError: No module named 'utilities'
Might be related: #11
I found another solution: put this at the beginning of export_trt.py:
import os, sys
# add the script's own directory to the import path so `utilities` resolves
file_dir = os.path.dirname(__file__)
sys.path.append(file_dir)
Nice!
Nice!
export onnx still not working.
Yeah, because I had a node update my TensorRT, none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.
I need to compile the .pth files into .onnx files, then compile those into engines using the 10.1.0.27 I have now, but the script is just flat-out no good. It finds everything alright but has internal errors.
> Nice!
> export onnx still not working.
I don't recommend exporting onnx models, because most models won't work and it requires high RAM, but the instructions are in export_onnx.py.
> Yeah, because I had a node update my tensorrt none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.
> I need to compile the .pth files into .onnx files then compile those into engines using the 10.1.0.27 I have now, but the script is just flat out no good. It finds everything alright but has internal errors.
You don't need to convert .pth to .onnx; just run export_trt.py with the new TensorRT version.
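The "Error Code: 6" above reflects how TensorRT works: a serialized engine plan only loads under the exact library version that built it, so the fix is always to regenerate the engine with the installed version. A tiny sketch of that compatibility rule; `needs_rebuild` is a hypothetical helper, not part of export_trt.py:

```python
# Hedged sketch: a TensorRT engine plan is tied to the exact library
# version that serialized it, per the "Error Code: 6" message above.
# `needs_rebuild` is a hypothetical helper, not part of export_trt.py.

def needs_rebuild(engine_built_with: str, installed_trt: str) -> bool:
    """True when the serialized engine must be regenerated by rerunning
    export_trt.py under the currently installed TensorRT."""
    return engine_built_with != installed_trt

needs_rebuild("10.0.1.6", "10.1.0.27")   # engine from the old version: rebuild
needs_rebuild("10.1.0.27", "10.1.0.27")  # versions match: engine loads
```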
> Yeah, because I had a node update my tensorrt none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.
> I need to compile the .pth files into .onnx files then compile those into engines using the 10.1.0.27 I have now, but the script is just flat out no good. It finds everything alright but has internal errors.
> you don't need to convert .pth to .onnx, just run export_trt.py with the new tensorrt version
I need to convert another .pth file.
> Yeah, because I had a node update my tensorrt none of this works now. Error Code: 6: The engine plan file is not compatible with this version of TensorRT, expecting library version 10.1.0.27 got 10.0.1.6, please rebuild.
> I need to compile the .pth files into .onnx files then compile those into engines using the 10.1.0.27 I have now, but the script is just flat out no good. It finds everything alright but has internal errors.
> you don't need to convert .pth to .onnx, just run export_trt.py with the new tensorrt version
Incorrect, as I did that twice and the same error came up.
Edit: I am just going to start over from scratch, as that YOLO-World node by Zho Zho Zho is a known-bad node; I just didn't know, and it zapped me.
Now I do have a ton (7 GB) of .pth files I would like to convert that should work.