JUGGHM/PENet_ICRA2021

Modify the backbone network

yuyu19970716 opened this issue · 3 comments

Hi!
I have a question for you!
Recently I replaced PENet's backbone network, ENet, with Attention U-Net, and I only used part of the KITTI dataset. Training felt a little slow, so I modified the learning rate when training the backbone network so that it could learn faster.
Training ENet goes fine (blue is the training set, yellow is the validation set).
But when I used the trained backbone for the second stage of training, the network overfit severely. Could this have something to do with the learning rate?
We can see that CSPN++ trains very poorly; the validation-set error is large.
What could be the reason for this?
Below is the learning rate schedule I used when training ENet.
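(For context, a step-style learning-rate decay like the one discussed here can be written as a small function — a minimal sketch mirroring the idea behind PyTorch's `StepLR`; the initial rate, step size, and decay factor below are illustrative, not the values actually used for PENet.)

```python
# Minimal sketch of a step-decay learning-rate schedule (same idea as
# torch.optim.lr_scheduler.StepLR). All numbers are illustrative only,
# not the values actually used when training ENet/PENet.
def step_decay_lr(initial_lr, epoch, step_size=10, gamma=0.5):
    """Multiply the learning rate by `gamma` every `step_size` epochs."""
    return initial_lr * gamma ** (epoch // step_size)

# e.g. start with a larger rate so the network learns faster early on,
# then decay it as training progresses:
for epoch in (0, 10, 20, 30):
    print(epoch, step_decay_lr(1e-3, epoch))
```

A larger initial rate can speed up early training, but if the rate stays high into stage 2 it can also make overfitting worse, which is one thing worth checking here.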

The screenshot above shows the results after training ENet.
I am very much looking forward to your reply!

It seems reasonable. Since the stage-2 training should be regarded as a parameter-initialization step for the CSPN++ module, it is normal that the error metrics after stage 2 are no better than those after stage 1. You could refer to previous issues for detailed discussion: issues #30, #28, and #25.
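(For readers following along: stage-2 training of this kind typically freezes the pretrained backbone and only updates the refinement module. A minimal PyTorch sketch, assuming hypothetical module names `backbone` and `cspn` — these are illustrative stand-ins, not PENet's actual attribute names or layer shapes:)

```python
# Sketch (assumption): stage-2 training that freezes the pretrained backbone
# and only updates the CSPN++ refinement parameters. The module names and
# tiny layers below are illustrative, not PENet's actual architecture.
import torch
import torch.nn as nn

class TwoStageModel(nn.Module):
    def __init__(self):
        super().__init__()
        # stand-in for the ENet / Attention U-Net backbone
        self.backbone = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
        # stand-in for the CSPN++ refinement module
        self.cspn = nn.Conv2d(8, 1, 3, padding=1)

    def forward(self, x):
        return self.cspn(self.backbone(x))

model = TwoStageModel()

# Freeze backbone weights so stage 2 only initializes/trains the CSPN++ part.
for p in model.backbone.parameters():
    p.requires_grad = False

# Give the optimizer only the still-trainable (CSPN++) parameters, often with
# a smaller learning rate than stage 1 to reduce the risk of overfitting.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

If the backbone is left unfrozen (or the stage-2 learning rate is kept as high as stage 1's), the refinement stage can overfit quickly on a small subset of KITTI, which matches the symptom described above.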

Thank you very much for your prompt answer!
I will carefully consider your suggestion!