Prototypical Network Clarification from Tutorial 16
AbhirKarande opened this issue · 2 comments
AbhirKarande commented
Have a few questions regarding how the classification works for the Prototypical Network:
- From a high-level perspective, for input to the ProtoNet model, are both the support set and the query point required in order to get a corresponding label as output?
- If I wanted to have just an image (a single query point) as the only input to a model, which is then compared against a fixed support set (not part of the input), would this still be considered a prototypical network problem? Would a plausible workaround be to pass the query point together with a static support set as input, and get a label (from that static set) as output?
- Is the pre-trained ProtoNet file provided generalizable to other datasets? What makes it (prototypical networks in general) generalizable?
phlippe commented
Hi,
- The ProtoNet uses the support set to create the prototype feature vectors, with which the query is compared. In that sense, yes, both support set and query set are inputs to the model.
- If you fix the support set but still calculate the prototypes by encoding the support set with the same encoder as the queries, then it would still be considered a ProtoNet. Thus, if you take a trained ProtoNet and evaluate it on a new dataset with fixed classes, it remains a ProtoNet due to the way it was trained. However, if you removed the prototypes and instead made them learnable feature vectors independent of any support set, it would no longer be a ProtoNet and would effectively become a standard classifier, since the last layer is then equivalent to a linear layer.
- ProtoNets generalize to other datasets better than standard classifiers because their classification head (the prototypes) adapts to each new dataset via the support set. That being said, this model has been trained on CIFAR100, so it will work best on datasets similar to CIFAR100, and performance decreases on datasets with very different statistics/characteristics.
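To make the first two points concrete, here is a minimal PyTorch sketch of the inference path described above: prototypes are computed by encoding the support set and averaging per class, and a query is classified by its (negative squared) Euclidean distance to each prototype. The function and variable names are placeholders for illustration, not the exact ones used in Tutorial 16.

```python
import torch

def classify_with_prototypes(encoder, support_x, support_y, query_x, n_classes):
    """Classify queries by distance to class prototypes (ProtoNet inference).

    The support set can be fixed and pre-encoded once; as long as the
    prototypes come from encoding a support set with the same encoder
    as the queries, this is still a ProtoNet.
    """
    support_feats = encoder(support_x)                  # (n_support, d)
    # One prototype per class: the mean feature vector of its support examples
    prototypes = torch.stack([
        support_feats[support_y == c].mean(dim=0)
        for c in range(n_classes)
    ])                                                  # (n_classes, d)
    query_feats = encoder(query_x)                      # (n_query, d)
    # Negative squared Euclidean distance serves as the classification logit
    logits = -torch.cdist(query_feats, prototypes) ** 2
    return logits.argmax(dim=1)                         # predicted class indices
```

If the support set is static, `prototypes` can be computed once and cached; only the `encoder(query_x)` call and the distance comparison need to run per query.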
AbhirKarande commented
Clarified! Thank you very much!