ThanatosShinji/onnx-tool

Add value_infer Support for OP 'Pad', 'Resize' and 'InstanceNormalization'

hollyaoyaozi opened this issue · 9 comments

Hi,

When I ran shape_infer & profile on Detic's ONNX model, it reported that value_infer is not implemented for PadNode, ResizeNode and InstanceNormalizationNode.
It would be appreciated if value_infer were implemented for these OPs.
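For reference, this is roughly how the profiling was invoked (a minimal sketch; the onnx_tool.model_profile entry point and the file name are assumptions based on the project's README, not copied from my actual run):

```python
# Minimal sketch: profiling a model with onnx-tool.
# Assumption: onnx_tool.model_profile is the entry point described in the README;
# 'detic.onnx' is a placeholder path.
import onnx_tool

onnx_tool.model_profile('detic.onnx')  # runs shape inference and produces the per-node profile table
```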

Thank you !

Is that a public ONNX model?


It is facebook/Detic's model, generated by someone in facebookresearch/Detic#113.
The ONNX file I used was downloaded from this link: https://drive.google.com/file/d/1hYz19lZk4ugLrUGO0HIP9M2RbXs5A4O-/view?usp=sharing

Besides, it seems the If OP is not implemented either, and it appears in the ONNX file downloaded from the link above. I am afraid that when shape_infer reaches an If node, it will report an error.
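A quick way to confirm where the If nodes are is to scan the graph with the standard onnx package (a small sketch; the file name is a placeholder):

```python
import onnx

model = onnx.load('detic.onnx')  # placeholder path
if_nodes = [n for n in model.graph.node if n.op_type == 'If']
print(f'{len(if_nodes)} If nodes found')
for n in if_nodes:
    print(' ', n.name, '->', list(n.output))
```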

I've viewed the model myself. The biggest problem is the If operator, which requires support for subgraphs. Supporting one or a few operators won't take much time, but subgraph support is a long-term story: it requires an architecture change to support operators that carry a small ONNX graph in their attribute data.


Got it, and thanks for your reply.

I came up with a solution that may work around my problem, given that onnx-tool's current architecture does not support subgraphs and the 'If' node. Viewing the model in netron.app, each 'If' node has at least one simple branch that contains only a single OP, such as 'Identity', 'Squeeze' or 'Constant'. That means I can hard-code which branch is taken and implement value_infer for each 'If' node by running the corresponding OP from that branch, selected by node.name (and using the corresponding input as the intensors for value_infer).
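Roughly, the check behind this idea looks like the sketch below (standard onnx package; the file name is a placeholder). It only verifies that each If has a single-op branch, which is the precondition for hard-coding which branch to run:

```python
import onnx
from onnx import helper

model = onnx.load('detic.onnx')  # placeholder path

for node in model.graph.node:
    if node.op_type != 'If':
        continue
    # then_branch / else_branch are GraphProto attributes of the If node.
    branches = {a.name: helper.get_attribute_value(a)
                for a in node.attribute
                if a.name in ('then_branch', 'else_branch')}
    single_op = {name: g.node[0].op_type
                 for name, g in branches.items() if len(g.node) == 1}
    print(node.name, single_op)  # e.g. {'then_branch': 'Identity'}
```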

My main purpose is to profile this ONNX model and get its FLOPs, and hard-coding these branches should not be a problem (the computation inside these 'If' nodes is small compared to the whole model). So I wonder whether this approach will work.

@hollyaoyaozi it's a good idea! You can give it a try. I also have one suggestion: export the backbone only. The ops that contain If are most likely created by roi_heads, and I believe the backbone should account for most of the FLOPs.
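If the backbone's boundary tensor names are known (e.g. looked up in netron), extracting it could be as simple as the sketch below; the tensor names here are placeholders, not the real names in this model:

```python
import onnx.utils

# Placeholder tensor names -- replace with the actual boundary tensors from netron.
input_names = ['images']
output_names = ['backbone_p3', 'backbone_p4', 'backbone_p5']

onnx.utils.extract_model('detic.onnx', 'detic_backbone.onnx',
                         input_names, output_names)
```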

@hollyaoyaozi here is one workaround link for removing the If operator. I hope it helps your case.
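In the same spirit, a rough sketch of splicing the single-op branch in place of each If node with the standard onnx package (assumptions: the then_branch is always the branch that would be taken and contains exactly one node, as observed in netron; the file names are placeholders):

```python
import onnx
from onnx import helper

model = onnx.load('detic.onnx')  # placeholder path

new_nodes = []
for node in model.graph.node:
    if node.op_type == 'If':
        branch = next(helper.get_attribute_value(a) for a in node.attribute
                      if a.name == 'then_branch')
        if len(branch.node) == 1:
            inner = onnx.NodeProto()
            inner.CopyFrom(branch.node[0])
            # Rewire the branch node so it produces the If node's outputs directly.
            del inner.output[:]
            inner.output.extend(node.output)
            inner.name = node.name + '_inlined'
            new_nodes.append(inner)
            continue
    new_nodes.append(node)

del model.graph.node[:]
model.graph.node.extend(new_nodes)
onnx.save(model, 'detic_no_if.onnx')
```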

The If node is not the only issue with this model. Another issue is that the model has to compute the value K for a TopK node, which is very slow to evaluate and also useless for profiling. So I set K to 1000 as its maximum value, and then I could get the profile table.
detic.txt
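One way to pin K with the standard onnx package is sketched below (file names are placeholders; it simply points every TopK at a shared int64 constant of 1000 instead of the computed value):

```python
import numpy as np
import onnx
from onnx import numpy_helper

model = onnx.load('detic.onnx')  # placeholder path

# TopK expects K as a 1-D int64 tensor; register a shared constant of 1000.
k_const = numpy_helper.from_array(np.array([1000], dtype=np.int64), name='fixed_topk_k')
model.graph.initializer.append(k_const)

for node in model.graph.node:
    if node.op_type == 'TopK' and len(node.input) > 1:
        node.input[1] = 'fixed_topk_k'  # the second input of TopK is K

onnx.save(model, 'detic_fixed_k.onnx')
```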

@hollyaoyaozi I've updated the example code here

@ThanatosShinji Thank you very much !