ibaiGorordo/ONNX-MobileStereoNet

Convert to ONNX model

michaelnguyen11 opened this issue · 3 comments

Hi @ibaiGorordo ,

Thank you for your great work!

I've tried to convert the MobileStereoNet PyTorch model to ONNX using torch.onnx.export. However, the resulting ONNX model is about 10 times larger than the PyTorch checkpoint (roughly 300 MB compared to 29 MB).

The script I used to convert to ONNX:

import torch
import onnx
from onnxsim import simplify
from models import MSNet2D

if __name__ == '__main__':

    # Load models
    maxdisp = 192

    model = MSNet2D(maxdisp)
    model.cuda()

    print('load model')
    model_path = 'weights/MSNet2D_SF_KITTI2015.ckpt'
    state_dict = torch.load(model_path)

    pretrained_dict = {key.replace("module.", ""): value for key, value in state_dict['model'].items()}
    model.load_state_dict(pretrained_dict)
    model.eval()
  
    W = 528
    H = 240
    onnx_file = f"mobileStereoNet_{H}x{W}.onnx"
    onnx_simp_file = f"mobileStereoNet_{H}x{W}_simp.onnx"
    
    x = torch.randn(1, 3, H, W).cuda()
    torch.onnx.export(model,
                      args=(x, x),
                      f=onnx_file,
                      opset_version=12)

    print("save model {} successfully".format(onnx_file))

    model_load = onnx.load(onnx_file)
    model_simp, check = simplify(model_load)
    onnx.save(model_simp, onnx_simp_file)

    print("save model {} successfully".format(onnx_simp_file))
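To quantify the size blowup, it can help to measure the two files directly after running the script. A minimal stdlib sketch (the helper name `size_mb` is mine, not from the repo):

```python
import os

def size_mb(path: str) -> float:
    """Return the file size in megabytes (decimal)."""
    return os.path.getsize(path) / 1e6

# Example: compare the raw export against the simplified one.
# (The file names match the script above; both must exist on disk.)
# print(f"raw:        {size_mb('mobileStereoNet_240x528.onnx'):.1f} MB")
# print(f"simplified: {size_mb('mobileStereoNet_240x528_simp.onnx'):.1f} MB")
```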

Could you give me a hint on how you converted the MobileStereoNet PyTorch model to ONNX?

Many thanks in advance!

Regards,
Michael

Hi, I also have the same issue.

Converting from PyTorch to ONNX is not the issue (that is the model I updated); the "problem" happens when you simplify the model: it suddenly grows to hundreds of MBs. Since the model needs to be downloaded, I decided to use the pre-simplification version. Also, the simplified version seems to be slower for some reason (Ref: https://twitter.com/PINTO03091/status/1464901764977618951).

For future reference (the models are not available yet as of today), it is probably better to check the models here:
https://github.com/PINTO0309/PINTO_model_zoo/tree/main/150_MobileStereoNet

The onnx-simplifier will unfold tensors, i.e., perform constant folding. If you add 1 to a tensor of size 1x3x224x224, PyTorch just records the scalar constant and the tensor shape, while onnx-simplifier materializes the constant into a full 1x3x224x224 tensor in which every value is 1. So the model size can get pretty big if there are many constant ops, like ConstantOfShape, Expand, or Tile, in your model.
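To see why this inflates the file so much, here is a back-of-the-envelope calculation in plain Python (no ONNX needed) for the 1x3x224x224 example above: a symbolic broadcast stores one scalar plus the target shape, while the folded constant stores every element as float32.

```python
shape = (1, 3, 224, 224)

# Before folding: the graph records one float32 scalar plus the
# target shape (four int64 dims) and lets the runtime broadcast it.
symbolic_bytes = 4 + 8 * len(shape)

# After folding: every element of the broadcast result is
# materialized as a float32 initializer baked into the file.
n_elements = 1
for d in shape:
    n_elements *= d
folded_bytes = 4 * n_elements

print(f"symbolic: {symbolic_bytes} B, folded: {folded_bytes / 1e6:.1f} MB")
# → symbolic: 36 B, folded: 0.6 MB
```

Multiply that by many such ops (ConstantOfShape, Expand, Tile) and the hundreds of megabytes reported above become plausible.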

I haven't tested it, but using some of the tools in https://github.com/PINTO0309/simple-onnx-processing-tools might help with that (particularly scs4onnx).