Allow variable `num_classes` for pre-trained models
Closed this issue · 0 comments
PaulCCCCCCH commented
Currently, if we want to load pre-trained weights into the pre-defined models, we have to set `num_classes = 1000` so the model architecture matches the checkpoint; otherwise an error is thrown. Since we now take `num_classes` from the dataset itself everywhere, the pre-trained option can only be turned on when the dataset loaded into Robustar happens to have exactly 1000 classes.
PyTorch allows loading a state dict partially, so we should be able to load the pre-trained weights into every layer except the final output layer.
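A minimal sketch of the partial-loading idea, using a hypothetical toy model (`ToyNet` and its layer names are illustrative, not Robustar's actual code): drop the final layer's keys from the checkpoint, then call `load_state_dict` with `strict=False`.

```python
import torch
import torch.nn as nn

# Toy stand-in for a pre-defined model: a feature extractor
# followed by a final classification layer named "fc".
class ToyNet(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.features = nn.Linear(8, 16)
        self.fc = nn.Linear(16, num_classes)

    def forward(self, x):
        return self.fc(torch.relu(self.features(x)))

pretrained = ToyNet(num_classes=1000)  # weights saved with 1000 classes
model = ToyNet(num_classes=10)         # our dataset has 10 classes

# Drop the final layer's parameters, whose shapes no longer match,
# then load everything else non-strictly.
state = {k: v for k, v in pretrained.state_dict().items()
         if not k.startswith("fc.")}
missing, unexpected = model.load_state_dict(state, strict=False)

# Only the final layer is reported missing; nothing is unexpected.
print(missing)     # ['fc.weight', 'fc.bias']
print(unexpected)  # []
```

The same pattern works with torchvision models (there the last layer is `fc` for ResNets, `classifier` for VGG-style nets); `strict=False` skips the dropped keys instead of raising a size-mismatch error.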