DirtyHarryLYL/HAKE-Action-Torch

Problems about loss_reweight function.

quan1e opened this issue · 2 comments

Dear authors:
Sorry to bother you again.
As described in the code and other issues, the precomputed loss weights are stored in the file loss_weights.npy, which was generated from the statistics of the HAKE dataset.
Does this mean that loss_weights.npy just stores the probabilities of each part and verb, and that during training the learned weights are reweighted by the ground-truth weight?
Also, I would like to know whether the weights in loss_weights.npy are updated during training.
Can these precomputed weights improve network performance? Are they a necessary part of the project?
Thanks.

hwfan commented

loss_weights.npy includes the weights of each pasta and verb class. The weights are computed by the formula:

w = k*log(k/p),

where p is the probability of the pasta/verb class. In practice, the value of k is defined in the config TRAIN.LOSS_WEIGHT_K, and the default setting of k is 2. The weights are frozen during the training procedure and won't be updated.
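Based on the formula above, the weight computation could be sketched as follows. The function name and the use of raw class counts are assumptions for illustration; only the formula w = k*log(k/p) and the default k = 2 (TRAIN.LOSS_WEIGHT_K) come from the answer:

```python
import numpy as np

def compute_loss_weights(class_counts, k=2.0):
    """Hypothetical sketch: derive per-class loss weights from label counts.

    k mirrors the config value TRAIN.LOSS_WEIGHT_K (default 2).
    """
    counts = np.asarray(class_counts, dtype=np.float64)
    p = counts / counts.sum()      # empirical probability of each class
    return k * np.log(k / p)       # w = k * log(k / p)

# Rare classes get larger weights than frequent ones.
weights = compute_loss_weights([900, 90, 10])
```

Since the weights depend only on the class statistics, they can be computed once, saved (e.g. to an .npy file), and kept frozen during training, as described above.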

In our experiments, introducing the loss weights forces the network to learn from the few-shot classes, which improves the overall performance on the HAKE test set. It is not a necessary step; you can remove it by setting TRAIN.WITH_LOSS_WTS to False.
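To illustrate how such frozen weights might enter the training loss, here is a minimal sketch of a per-class weighted binary cross-entropy. This is an assumed illustration, not the project's actual loss implementation; the function name and inputs are hypothetical:

```python
import numpy as np

def weighted_bce(probs, labels, class_weights):
    """Hypothetical sketch: binary cross-entropy scaled by frozen
    per-class weights, so rare classes contribute more to the loss."""
    probs = np.clip(probs, 1e-7, 1 - 1e-7)          # numerical safety
    bce = -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))
    return float((bce * class_weights).mean())

# A rare class (large weight) dominates the averaged loss.
loss = weighted_bce(np.array([0.8, 0.2]),
                    np.array([1.0, 0.0]),
                    np.array([1.6, 10.6]))
```

Setting all weights to 1 would recover the unweighted loss, which matches the option of disabling the reweighting via the config flag.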

Thanks. I will remove it when training on my own dataset.