nuclearboy95/Anomaly-Detection-PatchSVDD-PyTorch

Performance of some classes drops when training for more epochs

chinchia opened this issue · 3 comments

I don't quite understand why the performance of the Carpet, Grid, Tile, and Screw classes keeps getting worse during training. Their best performance almost always appears in the first epoch, and the Grid class even stays at AUROC=0.5 for both detection and segmentation.
Does anyone know why this happens?

[image: per-class AUROC results over training epochs]

I have the same problem

For the grid class, try lambda=1e-3 and lr=1e-4.

When lambda is too large, the loss tends to squash the feature embeddings together because of the L_SVDD term.
In that case you get a trivial solution: the model outputs a constant.

When you get AUROC=0.500 for both detection and segmentation, it means the model has collapsed to a trivial solution.
So AUROC=0.500 is a sign that lambda is too large.
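A quick way to see why a collapsed model lands at exactly 0.500: with the rank-based (Mann-Whitney) definition of AUROC, a constant score ties every positive against every negative. The helper below is hypothetical, not from the repo:

```python
import numpy as np

def auroc(labels, scores):
    """Rank-based AUROC: P(score_pos > score_neg) + 0.5 * P(tie)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

labels = np.array([0, 0, 0, 1, 1, 1])
constant = np.ones(6)           # collapsed model: same score everywhere
print(auroc(labels, constant))  # -> 0.5

anti = np.array([6, 5, 4, 3, 2, 1], float)  # anti-correlated scores
print(auroc(labels, anti))      # -> 0.0, wrong but still informative
```

This also shows why AUROC below 0.5 is different from AUROC equal to 0.5: an anti-correlated score still ranks the samples, just in the wrong order.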

As long as AUROC != 0.500, a low AUROC by itself does not mean a trivial solution, because the model is still outputting something meaningful.
For some classes (screw in particular; check out the figure below) the performance starts below 0.5 and climbs as training proceeds, so you may want to train for more epochs.
[image: AUROC for the screw class rising over training epochs]

However, when lambda=0 (only self-supervised learning is used), AUROC stays below 0.5 indefinitely.
So I think L_SVDD is important, but when lambda is too large, it screws everything up.

Btw, I used lambda=1.0 and lr=1e-4 for the screw class. Lambda=0.3 to 1.0 seems to be the sweet spot for that class.
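The whole discussion boils down to one weighting in the total loss. A hypothetical sketch of how the per-class lambdas from this thread could be wired in (function and dict names are illustrative, not the repo's API):

```python
# Per-class lambda values taken from this thread; other classes
# would need their own tuning.
LAMBDA = {"grid": 1e-3, "screw": 1.0}

def total_loss(loss_ssl, loss_svdd, class_name):
    """Combine the two terms: L = L_ssl + lambda * L_svdd."""
    return loss_ssl + LAMBDA.get(class_name, 1.0) * loss_svdd

print(total_loss(0.7, 0.2, "grid"))   # SVDD term nearly switched off
print(total_loss(0.7, 0.2, "screw"))  # SVDD term at full weight
```

With lambda=1e-3 the SVDD term barely moves the embeddings (avoiding collapse on grid), while lambda=1.0 keeps it fully active for screw.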

Hi author, thanks for the detailed explanation. I tried lambda=1e-3 for the grid class and lambda=1.0 for the screw class, and they did work!