oarriaga/paz

Questions about the size of the trained model file (emotion classification)

David9591 opened this issue · 4 comments

For emotion classification, the model file (46,344 KB) produced by training with examples/face_classification/train.py is much larger than the pre-trained file "fer2013_mini_XCEPTION.119-0.65.hdf5" (848 KB) that you provide. What is the cause of this difference? Please forgive me if this is a simple question; my understanding of deep learning is still limited.

Best wishes!

It could be several things. One of them is that the model saved in train.py includes the complete state of the optimizer. How different are the sizes?
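As a rough illustration of the optimizer-state effect, here is a minimal sketch with a toy Keras model (hypothetical layer sizes, not the mini-XCEPTION architecture): after one training step with Adam, a full save carries the architecture, the weights, and the optimizer's per-weight moment tensors, so the file is noticeably larger than a weights-only save.

```python
import os

import numpy as np
import tensorflow as tf

# Toy model (hypothetical sizes, not the mini-XCEPTION architecture).
inputs = tf.keras.Input(shape=(48,))
hidden = tf.keras.layers.Dense(256, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(7, activation="softmax")(hidden)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")

# One training step so Adam builds its per-weight moment tensors.
x_batch = np.random.rand(8, 48).astype("float32")
y_batch = tf.keras.utils.to_categorical(np.random.randint(0, 7, 8), 7)
model.fit(x_batch, y_batch, epochs=1, verbose=0)

# Full save: architecture + weights + optimizer state.
model.save("full_model.h5")
# Weights-only save: just the layer weights.
model.save_weights("weights_only.weights.h5")

print(os.path.getsize("full_model.h5"))
print(os.path.getsize("weights_only.weights.h5"))
```

With Adam keeping two moment tensors per weight, the full file is roughly twice the weights-only file plus the serialized architecture.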


With the default settings, the checkpoint saves the complete model, giving me a 46,452 KB file. With save_weights_only=True, only the weight data should be kept, yet the file is still 15,567 KB, while the official model is only 848 KB. This seems very strange; could you please explain it? Thank you very much!
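For reference, the two checkpoint configurations being compared look roughly like this (filenames are hypothetical, not the ones used in train.py):

```python
from tensorflow.keras.callbacks import ModelCheckpoint

# Default: each checkpoint saves the full model
# (architecture + weights + optimizer state).
full_checkpoint = ModelCheckpoint("model.{epoch:02d}.hdf5")

# Weights only: smaller files, since the optimizer state and
# architecture are omitted.
weights_checkpoint = ModelCheckpoint(
    "weights.{epoch:02d}.weights.h5", save_weights_only=True)

print(full_checkpoint.save_weights_only, weights_checkpoint.save_weights_only)
```

The weights-only file being 15,567 KB rather than 848 KB suggests the remaining gap comes from the architecture itself, not the saving mode.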

It could be that they are actually different models, or that the TensorFlow HDF5 weight-file format has changed and models are now slightly bigger. One way to look into this would be to load both models, call model.summary() on each, and compare the differences.
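A minimal sketch of that diagnostic, using two toy models of different widths to stand in for the trained file and the released fer2013_mini_XCEPTION file (filenames and architectures here are hypothetical):

```python
import tensorflow as tf


def make_model(width):
    """Build a small classifier; `width` stands in for architectural differences."""
    inputs = tf.keras.Input(shape=(48,))
    hidden = tf.keras.layers.Dense(width, activation="relu")(inputs)
    outputs = tf.keras.layers.Dense(7, activation="softmax")(hidden)
    return tf.keras.Model(inputs, outputs)


make_model(256).save("trained_stand_in.h5")
make_model(16).save("reference_stand_in.h5")

# Load both saved files and compare their structures side by side.
trained = tf.keras.models.load_model("trained_stand_in.h5")
reference = tf.keras.models.load_model("reference_stand_in.h5")

trained.summary()
reference.summary()
print(trained.count_params(), reference.count_params())
```

If the two summaries show different layers or the parameter counts diverge substantially, the files are simply different models, which would account for the size gap.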


Yes, you are right! After printing out both model structures, I found that there is indeed a big difference, which also explains the gap in parameter counts. I will continue to study it in depth. Thank you for your patient answer!