`torch.nn.DataParallel` changes name of parameters
Closed · 1 comment
mayank010698 commented
TPGM/DomainNet_ResNet_Exp/main_finetune.py, line 127 (commit 82f0eb0)
Hi,
in TPGM/DomainNet_ResNet_Exp/main_finetune.py, line 108 (commit 82f0eb0), wrapping the model in `torch.nn.DataParallel` changes the parameter names from `head.bias` to `module.head.bias`. As a result, TPGM parameters are also learned for the final layer's weights and biases. Is this intentional?
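The renaming can be reproduced with a minimal sketch; the `Net` model below is a hypothetical stand-in, not the repository's actual ResNet:

```python
import torch.nn as nn

# Hypothetical stand-in model with a final "head" layer.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(4, 4)
        self.head = nn.Linear(4, 2)

model = Net()
print([name for name, _ in model.named_parameters()])
# → ['backbone.weight', 'backbone.bias', 'head.weight', 'head.bias']

# DataParallel nests the wrapped model under an attribute called
# "module", so every parameter name gains a "module." prefix.
wrapped = nn.DataParallel(model)
print([name for name, _ in wrapped.named_parameters()])
# → ['module.backbone.weight', 'module.backbone.bias',
#    'module.head.weight', 'module.head.bias']
```

Any code that matches parameters by name (such as an exclude list) therefore stops matching once the model is wrapped.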
PotatoTian commented
Hi,
Thanks for pointing this out. This is not intentional; the head weights are supposed to be excluded because they are randomly initialized. Please change the list to `exclude_list=["module.head.weight", "module.head.bias"]`. I will update the code shortly as well.
Best,
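The suggested fix can be sketched as follows; `Net` is a hypothetical stand-in model, and the filtering is an illustrative assumption about how an exclude list is typically applied, not the repository's exact code:

```python
import torch.nn as nn

class Net(nn.Module):  # hypothetical stand-in for the actual ResNet
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(4, 4)
        self.head = nn.Linear(4, 2)

wrapped = nn.DataParallel(Net())

# After DataParallel wrapping, names carry the "module." prefix,
# so the exclude list must use the prefixed names.
exclude_list = ["module.head.weight", "module.head.bias"]

# Keep only parameters whose full name is not excluded, e.g. when
# selecting which parameters TPGM should constrain.
kept = [name for name, _ in wrapped.named_parameters()
        if name not in exclude_list]
print(kept)
# → ['module.backbone.weight', 'module.backbone.bias']
```

With the unprefixed names ("head.weight", "head.bias"), nothing would match and the head would not be excluded, which is the bug reported above.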