about clip layer
wuzhiyang2016 opened this issue · 7 comments
Hello, when I use quantization training to train a quantized model, the training code inserts clip layers after some layers, and there are some problems:
1. For an original float model structure of conv + bn + relu + avgpooling, it is converted to conv + bn + clip + avgpooling + clip. But when the model import tool is used to convert the model, the tool's code (see the function tidl_mergeClipLayer()) does not merge the clip layer that comes after the avgpooling layer, and raises the error "...the model will not work".
2. The same problem happens with dataLayer + clip.
So, should users change the code in tidl_mergeClipLayer(), or is there something wrong with the model structure?
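For reference, a minimal float module with the structure from point 1 (a hypothetical sketch; the channel counts and kernel sizes are illustrative, not from the actual model) would look like:

```python
import torch

class ConvBnReluPool(torch.nn.Module):
    """Minimal float structure: conv + bn + relu + avgpooling."""
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.bn = torch.nn.BatchNorm2d(16)
        self.relu = torch.nn.ReLU()
        self.pool = torch.nn.AvgPool2d(kernel_size=2)

    def forward(self, x):
        # after quantized training this chain becomes:
        # conv + bn + clip + avgpooling + clip
        return self.pool(self.relu(self.bn(self.conv(x))))
```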
Hi,
In pytorch-jacinto-ai-devkit:
in the file modules/pytorch_jacinto_ai/xnn/quantize/quant_graph_module.py
you can see the lines:
```python
self.quantize_out_blocks = (torch.nn.ReLU, torch.nn.ReLU6, torch.nn.Hardtanh, layers.QAct, layers.PAct2,
                            layers.AddBlock, layers.CatBlock, layers.MultBlock, torch.nn.MaxPool2d, torch.nn.AvgPool2d)
```
Please try after removing torch.nn.AvgPool2d from that list.
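That is, after the change the line would read (a sketch, assuming no other edits in that file are needed):

```python
self.quantize_out_blocks = (torch.nn.ReLU, torch.nn.ReLU6, torch.nn.Hardtanh, layers.QAct, layers.PAct2,
                            layers.AddBlock, layers.CatBlock, layers.MultBlock, torch.nn.MaxPool2d)
```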
Let us know if it works.
OK, I will try that. How should I deal with the dataLayer's clip layer? For a structure of dataLayer + conv, it is converted to dataLayer + clip + conv, and this clip will not be merged either.
Are you facing any issue with that clip being there?
I noticed that you said there was an issue with that clip being there. (We are not facing that issue, but in our case it was dataLayer + BN + clip, because TIDL inserted a BN layer due to inDataNorm.)
To avoid that clip, do the following:
In pytorch-jacinto-ai-devkit:
in the file modules/pytorch_jacinto_ai/xnn/quantize/quant_graph_module.py
in the function: _analyse_connections_op
you can see the line:
```python
quantize_in = utils.is_conv_deconv_linear(module) and not is_input_quantized and \
              not is_input_ignored and is_first_module
```
Please change it to:
```python
quantize_in = False
```
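For clarity, the same edit expressed as a diff (surrounding function body elided):

```diff
-quantize_in = utils.is_conv_deconv_linear(module) and not is_input_quantized and \
-              not is_input_ignored and is_first_module
+quantize_in = False
```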
Let us know how it goes.
Thanks very much, I'll try that.
Hi @wuzhiyang2016, Which version of TIDL did you get these errors in?
The folder name is tidl_j7_01_00_00_00 @mathmanu