Some questions about num-classes and class_weights
Xin-hhh opened this issue · 3 comments
(1) I would like to ask why num_classes=1 instead of 2?
(2)
```python
train_counters = collections.Counter(image[1] for image in train_dataset)
class_weights = train_counters[0] / train_counters[1]
loss_fn = torch.nn.BCEWithLogitsLoss(pos_weight=torch.tensor([class_weights]))
```
In the case where the number of frames taken from each video is fixed and the same, will class_weights be 1?
I have a weak foundation, thank you very much for your answer.
Hi Xin, it is quite common to use num_classes=1 for binary classification. It is used in combination with loss functions like the one we used (BCEWithLogitsLoss). The network simply has a single output neuron: the prediction is the first class if its value is below a certain threshold and the second class if it is above. It would also have been possible to train with a different loss and two output neurons, which, via softmax, would produce two distinct scores, with the higher one chosen as the prediction.
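A minimal pure-Python sketch of the two heads described above (the values here are made up for illustration): a single logit with sigmoid + threshold, versus two logits with softmax + argmax.

```python
import math

def sigmoid(x):
    # Maps a single logit to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # Numerically stable softmax over a list of logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# num_classes=1: one output neuron, threshold the sigmoid at 0.5.
logit = 0.8                              # hypothetical network output
pred_single = int(sigmoid(logit) > 0.5)  # class 1 if above the threshold

# Two-class alternative: two logits, pick the argmax of the softmax.
two_logits = [0.0, 0.8]                  # hypothetical class-0/class-1 scores
probs = softmax(two_logits)
pred_two = probs.index(max(probs))

print(pred_single, pred_two)  # both heads predict class 1 here
```

With a well-chosen threshold the two formulations are equivalent for binary problems; the single-neuron version just has fewer parameters in the final layer.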
Class weights are computed dynamically from the number of training samples in each class. BCEWithLogitsLoss supports this through the pos_weight argument, which in the code receives a list with a single value. That value is greater than 1 if there are more samples of class 0 than of class 1; otherwise it lies between 0 and 1.
The number of frames considered per video is fixed inside the config files. The only case in which class_weights equals 1 is when you have exactly the same number of images from class 0 and class 1.
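To make the last point concrete, here is a small sketch of the same ratio computation applied to made-up label lists (only the labels matter for the weight), assuming class 1 is the positive class as in the snippet above:

```python
import collections

def class_weights(labels):
    # Ratio of class-0 count to class-1 count, as in the training code.
    counters = collections.Counter(labels)
    return counters[0] / counters[1]

balanced = [0] * 50 + [1] * 50     # same number of samples per class
imbalanced = [0] * 80 + [1] * 20   # four times more class-0 samples

print(class_weights(balanced))     # 1.0 -> no reweighting of the positive class
print(class_weights(imbalanced))   # 4.0 -> positive class weighted 4x in the loss
```

So a perfectly balanced dataset yields pos_weight=1, which leaves BCEWithLogitsLoss unchanged.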
I got it, thanks a lot for your answer.
You're welcome!