Awesome work! Can you share your trained parameters for the neural network by the way?
guanchuwang opened this issue · 2 comments
weiaicunzai commented
Sorry, I deleted the weight files once I got the evaluation results. You could try setting the warmup epoch to 5 when training large networks like ResNet, GoogLeNet, VGG, etc.; for the other hyperparameters, simply using the defaults should give you decent results. For lightweight networks (ShuffleNet, MobileNet), you could set the warmup epoch to 1 or 2.
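For reference, warmup typically means ramping the learning rate linearly from near zero up to the base value over the first few epochs. A minimal sketch (the function name, base LR of 0.1, and step counts here are illustrative assumptions, not taken from this repository):

```python
def warmup_lr(base_lr, step, warmup_steps):
    """Linearly ramp the learning rate from ~0 up to base_lr over warmup_steps,
    then hold it at base_lr (illustrative helper, not the repo's implementation)."""
    if step >= warmup_steps:
        return base_lr
    return base_lr * (step + 1) / warmup_steps

# Example: 5 warmup epochs at 100 batches per epoch -> 500 warmup steps
lrs = [warmup_lr(0.1, s, 500) for s in range(600)]
```

After the warmup steps the schedule hands off to whatever decay policy the training script uses by default.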
guanchuwang commented
Thanks a lot!