JosephChenHub/CenterNet-TensorRT

Unsupported ONNX data type: UINT8 (2)


I followed your steps, but I got an error when converting the ONNX model to TensorRT:

Unsupported ONNX data type: UINT8 (2)
While parsing node number 108 [Cast -> "292"]:

How can I fix this error?

Thank you.

Hi, I am working on this ONNX model too and I ran into the same error you mentioned.
Here is how I solved the UINT8 type error.

import onnx_graphsurgeon as gs
import onnx

# Load the exported model into a GraphSurgeon graph.
graph = gs.import_onnx(onnx.load("ctdet-resdcn18.onnx"))

# Take the first Cast node (the one whose "to" attribute is UINT8)
# and change its target type to 6 (TensorProto.INT32 in the ONNX type enum).
target_node = [node for node in graph.nodes if node.op == 'Cast'][0]
target_node.attrs['to'] = 6

onnx.save(gs.export_onnx(graph), "CenterNet_trt.onnx")
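A slightly more defensive variant (just a sketch, not tested on this model) is to select Cast nodes by their "to" attribute instead of by position, so the rewrite does not depend on node ordering. It targets INT32 to mirror the snippet above; the right target type ultimately depends on how the tensor is used downstream.

import onnx
import onnx_graphsurgeon as gs

graph = gs.import_onnx(onnx.load("ctdet-resdcn18.onnx"))

# Retarget every Cast whose "to" attribute is UINT8 (2) to INT32 (6).
for node in graph.nodes:
    if node.op == 'Cast' and node.attrs.get('to') == onnx.TensorProto.UINT8:
        node.attrs['to'] = onnx.TensorProto.INT32

onnx.save(gs.export_onnx(graph), "CenterNet_trt.onnx")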

If the onnx_graphsurgeon module is not found in Python, install it with:

pip install nvidia-pyindex
pip install onnx-graphsurgeon

According to Netron (screenshot of the model graph showing the two Cast layers):

The first Cast layer casts to UINT8 and the second casts to FLOAT, so I only changed the "to" attribute of the first Cast layer.
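If you prefer to confirm this without Netron, a quick way (a small helper script of my own, adjust the file name as needed) is to print the "to" attribute of every Cast node:

import onnx
from onnx import TensorProto

model = onnx.load("ctdet-resdcn18.onnx")
# Print the target data type of every Cast node in graph order.
for i, node in enumerate(model.graph.node):
    if node.op_type == "Cast":
        to = next(a.i for a in node.attribute if a.name == "to")
        print(i, node.name, TensorProto.DataType.Name(to))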

For the numeric values of the Cast "to" attribute (the TensorProto.DataType enum), see:
https://github.com/onnx/onnx/blob/master/onnx/onnx.in.proto#L449-L478
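The same values can also be read directly from the onnx Python package, for example:

import onnx

# A few TensorProto.DataType values relevant here:
print(onnx.TensorProto.FLOAT)  # 1
print(onnx.TensorProto.UINT8)  # 2
print(onnx.TensorProto.INT32)  # 6
print(onnx.TensorProto.INT64)  # 7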

This is only how I currently work around the unsupported-type issue; I haven't tested the inference results yet.
Hope it helps.

Thank you.