YOLOv5: accuracy problem after changing SiLU to ReLU and exporting to ONNX
maro-jeon opened this issue · 3 comments
maro-jeon commented
Hi,
I saw your study on changing the activation function from SiLU to ReLU.
The reported score shows a ReLU mAP of 55.7 (swish pretrained),
so I simply swapped the SiLU activation for ReLU without any fine-tuning.
The resulting accuracy was zero.
Could you describe in more detail how to reproduce the 55.7 (swish pretrained) score?
Thanks,
Maro JEON
Tyler-D commented
You have to retrain the model.
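For context, swapping the activation module alone changes the network's function while the weights remain tuned to SiLU, which is why mAP collapses to zero until the model is retrained. Below is a minimal, hypothetical sketch (not the actual YOLOv5 code) of replacing every `nn.SiLU` with `nn.ReLU` in a PyTorch model before retraining; `replace_silu_with_relu` is an illustrative helper name, not part of the YOLOv5 repo:

```python
import torch.nn as nn

def replace_silu_with_relu(model: nn.Module) -> nn.Module:
    """Recursively replace every SiLU activation with ReLU, in place.

    Note: the convolution/BN weights are unchanged, so the modified
    model still needs retraining (or long fine-tuning) to recover mAP.
    """
    for name, child in model.named_children():
        if isinstance(child, nn.SiLU):
            setattr(model, name, nn.ReLU(inplace=True))
        else:
            replace_silu_with_relu(child)
    return model

# Hypothetical stand-in for a YOLOv5 Conv block (Conv2d -> BatchNorm -> SiLU)
block = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8), nn.SiLU())
replace_silu_with_relu(block)
```

After a swap like this, the model is then retrained from the modified architecture, which is what the 300-epoch run below refers to.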
maro-jeon commented
Thanks. Did you just fine-tune, or train for many epochs?
Tyler-D commented
Train for 300 epochs.