Pretraining on ImageNet
tarun005 opened this issue · 1 comment
tarun005 commented
You mentioned that the backbone network is a ResNet-50 pretrained on ImageNet.
Universal-Domain-Adaptation/net.py, line 37 in 5d7caa9
But in many of the paper's experiments, the labels in the unsupervised target domain overlap with the supervised ImageNet labels. Is it justified to pretrain on supervised ImageNet labels in that setting?
youkaichao commented
That's a nice point. We use pre-trained models because they are the de facto standard initialization in this literature. Perhaps you could investigate how pre-training affects the performance :)