How many epochs during finetuning?
phongnhhn92 opened this issue · 4 comments
Hello sirs, I have one question. I cannot find any information about the number of epochs you used to finetune the flowers dataset with NASNet. I tried to read the code but there is no information there either. Can you clarify it?
Hi @phongnhhn92,
There's no need to fix the number of epochs; you can stop training when the validation accuracy stops improving. Usually 30 to 90 epochs will be fine, depending on your learning rate and decay policy.
With regards!
Yeephycho
Thanks for your answer! But there is one thing I don't understand. Assuming that I am using the original config for finetuning (batch size 32, RMSProp optimizer) and I stop my training at step 50k, how do I know how many epochs of finetuning I performed? Sorry if this is a silly question.
training_steps * batch_size / total number of training images = epochs
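For example, a minimal sketch of that calculation in Python. The numbers below are assumptions taken from the question (50k steps, batch size 32) plus roughly 3320 training images for the TF-Slim flowers split; substitute your own values:

```python
# Assumed values: 50k training steps, batch size 32 (from the question above),
# and ~3320 training images (approximate TF-Slim flowers training split).
training_steps = 50_000
batch_size = 32
num_training_images = 3320

# epochs = total images seen / size of the training set
epochs = training_steps * batch_size / num_training_images
print(f"Approximately {epochs:.1f} epochs")  # ~481.9 epochs
```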
Thanks for your help!