Jingkang50/OpenOOD

Implementation of ASH seems incorrect.

Closed this issue · 3 comments

While going through the implementation of ASH, I noticed that the thresholding is not input-dependent.

ASH zeroes out a percentile of the lowest activations of each sample, so the threshold must be computed per input. In your implementation, that per-sample threshold calculation seems to be missing: only self.percentile itself is passed as the threshold, which clips the activations at a fixed value in [0, 1].

I hope I am not missing something in your code and have understood it correctly. If so, I hope this helps identify the bug.
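For reference, the input-dependent thresholding described above can be sketched as follows. This is a minimal ASH-B-style sketch, not the repo's exact code: it assumes a 2-D (batch, features) tensor, and the function name and the "fill with sum/k" binarization detail are illustrative.

```python
import torch

def ash_b_sketch(x, percentile=65):
    # x: (batch, features). The threshold is derived per sample:
    # keep the top-k activations of each row, zero the rest.
    b, n = x.shape
    k = n - int(n * percentile / 100)      # number of activations to keep
    s1 = x.sum(dim=1)                      # per-sample sum before pruning
    _, idx = torch.topk(x, k, dim=1)       # input-dependent: top-k per row
    out = torch.zeros_like(x)
    fill = (s1 / k).unsqueeze(1).expand(-1, k)
    out.scatter_(1, idx, fill)             # binarize: survivors share the sum
    return out
```

Note that `percentile` here selects which fraction of activations to drop per sample; it is not itself used as an activation cutoff.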

I can see where this confusion comes from. When running ASH, the ASH-specific model wrapper is applied:

```python
elif postprocessor_name == 'ash':
    net = ASHNet(net)
```

Its forward_threshold calls the corresponding ASH processing function, which computes the threshold from the percentile:
```python
def forward_threshold(self, x, percentile):
    _, feature = self.backbone(x, return_feature=True)
    feature = ash_b(feature.view(feature.size(0), -1, 1, 1), percentile)
    feature = feature.view(feature.size(0), -1)
    logits_cls = self.backbone.get_fc_layer()(feature)
    return logits_cls
```

You are probably looking at the vanilla model's forward_threshold, which is used for ReAct.
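For contrast, a ReAct-style forward_threshold clips activations at a single fixed scalar, which is why it takes a threshold in a fixed range rather than a percentile. A hedged sketch (the function name is illustrative; the repo's actual signature may differ):

```python
import torch

def react_style_threshold(feature, threshold):
    # ReAct: clip every activation at one global, input-independent
    # threshold, unlike ASH, which derives a per-sample threshold
    # from a percentile of each input's activations.
    return feature.clip(max=threshold)
```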

Closing now. Feel free to reopen if you have more questions.

Thank you for the quick response! I indeed thought that forward_threshold, e.g. from the resnet18_32x32 class, would be called. My bad!