CaoWGG/TensorRT-CenterNet

Convert Hourglass to TensorRT

Opened this issue · 0 comments

Hi,

For a while, I have been trying to convert the Hourglass model of CenterNet to TensorRT through the ONNX format. Below, I attached the part of the script that I used to convert the Hourglass model to ONNX:


from collections import OrderedDict
from types import MethodType

import torch
from torch.onnx import OperatorExportTypes


def hourglass_forward(self, x):
    # Replacement forward that returns a flat list of head outputs
    # instead of a list of dicts, to keep the ONNX graph simple.
    inter = self.pre(x)
    ret = []
    for ind in range(self.nstack):
        kp_, cnv_ = self.kps[ind], self.cnvs[ind]
        kp = kp_(inter)
        cnv = cnv_(kp)
        for head in self.heads:
            layer = getattr(self, head)[ind]
            y = layer(cnv)
            ret.append(y)
        if ind < self.nstack - 1:
            inter = self.inters_[ind](inter) + self.cnvs_[ind](cnv)
            inter = self.relu(inter)
            inter = self.inters[ind](inter)
    return ret

...

opt = opts().init()
opt.arch = 'hourglass_104'
opt.heads = OrderedDict([('hm', 80), ('reg', 2), ('wh', 2)])
opt.head_conv = 256 if 'hourglass' in opt.arch else 64
print(opt)
model = create_model(opt.arch, opt.heads, opt.head_conv)
# Patch the model's forward with the flat-output version above.
model.forward = MethodType(forward[opt.arch.split('_')[0]], model)
load_model(model, 'ctdet_coco_hg.pth')
model.eval()
model.cuda()
input = torch.zeros([1, 3, 512, 512]).cuda()
torch.onnx.export(model, input, "ctdet_coco_hg.onnx", verbose=True,
                  operator_export_type=OperatorExportTypes.ONNX)


There may be some mistakes in my script, but the conversion to ONNX completed. However, when I tried to apply the first step of TensorRT to generate the engine, it showed these errors:


WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
WARNING: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
Successfully casted down to INT32.
While parsing node number 208 [Gather]:
ERROR: ../onnx2trt_utils.hpp:335 In function convert_axis:
[8] Assertion failed: axis >= 0 && axis < nbDims
ERROR: failed to parse onnx file
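For reference, the failing assertion comes from the parser's axis normalization for the Gather node. The following is an illustrative Python sketch of what such a check roughly does (not the actual onnx2trt_utils.hpp source):

```python
# Sketch of the check behind "Assertion failed: axis >= 0 && axis < nbDims"
# (illustrative only, not the real onnx-tensorrt implementation).
def convert_axis(axis: int, nb_dims: int) -> int:
    """Normalize a possibly-negative ONNX axis against a tensor's rank."""
    if axis < 0:
        axis += nb_dims  # ONNX allows axes counted from the end
    assert 0 <= axis < nb_dims, "axis out of range for tensor rank"
    return axis

print(convert_axis(0, 4))    # -> 0: Gather on axis 0 of a 4-D tensor is fine
print(convert_axis(-1, 4))   # -> 3: -1 normalizes to the last axis
```

A Gather node like number 208 often comes from shape arithmetic in the Python forward (e.g. `x.size(0)` exports as Shape followed by Gather), and because the older TensorRT ONNX parser handles the batch dimension implicitly, gathering on the batch axis can push the normalized axis out of range. Hard-coding such sizes as constants before export is a common workaround, though whether that is the cause here would need checking against the exported graph.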


Is there any suggestion about these errors?