pytorch-loss

My implementations of label-smooth, amsoftmax, focal-loss, dual-focal-loss, triplet-loss, giou-loss, affinity-loss, pc_softmax_cross_entropy, and dice-loss (both generalized soft dice loss and batch soft dice loss). Maybe these will be useful in my future work.
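As a rough illustration of what a label-smooth cross entropy does, here is a minimal sketch. It is not the repo's actual API; the class name `LabelSmoothCE` and its arguments are placeholders chosen for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelSmoothCE(nn.Module):
    """Cross entropy with uniformly smoothed targets (illustrative sketch, not this repo's API)."""
    def __init__(self, smooth=0.1):
        super().__init__()
        self.smooth = smooth

    def forward(self, logits, labels):
        # logits: (N, C), labels: (N,) with integer class indices
        num_classes = logits.size(1)
        log_probs = F.log_softmax(logits, dim=1)
        with torch.no_grad():
            # spread `smooth` mass uniformly over all classes,
            # keep the remaining (1 - smooth) mass on the true class
            target = torch.full_like(log_probs, self.smooth / num_classes)
            target.scatter_(1, labels.unsqueeze(1),
                            1.0 - self.smooth + self.smooth / num_classes)
        return -(target * log_probs).sum(dim=1).mean()
```

It would be used like any other criterion, e.g. `loss = LabelSmoothCE(smooth=0.1)(logits, labels)`.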

I also tried to implement the swish and mish activation functions.
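For reference, swish(x) = x * sigmoid(x) and mish(x) = x * tanh(softplus(x)). Below is a naive PyTorch sketch of both; the repo may implement them differently (e.g. with custom autograd or CUDA kernels for memory efficiency), which this does not reflect.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Swish(nn.Module):
    # swish(x) = x * sigmoid(x)
    def forward(self, x):
        return x * torch.sigmoid(x)

class Mish(nn.Module):
    # mish(x) = x * tanh(softplus(x))
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))
```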

Additionally, a one-hot encoding function is included.
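A one-hot helper is typically used to turn integer label maps into per-class target tensors for dense losses. The sketch below is only an assumption about what such a helper might look like (the `ignore_index` convention and the function signature are my own, not taken from this repo):

```python
import torch

def one_hot_sketch(labels, num_classes, ignore_index=255):
    # labels: integer tensor of shape (N, ...) -> float tensor of shape (N, num_classes, ...)
    # positions equal to ignore_index become all-zero vectors (assumed convention)
    shape = list(labels.shape)
    out = torch.zeros([shape[0], num_classes] + shape[1:],
                      dtype=torch.float, device=labels.device)
    valid = labels != ignore_index
    idx = labels.clone()
    idx[~valid] = 0                      # temporary index for ignored positions
    out.scatter_(1, idx.unsqueeze(1), 1.0)
    out = out * valid.unsqueeze(1).float()  # zero out ignored positions
    return out
```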

An "Exponential Moving Average (EMA)" operator has also been newly added.
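An EMA operator generally keeps a shadow copy of the model parameters and updates it as `shadow = alpha * shadow + (1 - alpha) * param` after each training step. The following is a minimal sketch under that assumption; the class name, method names, and default `alpha` are not taken from this repo.

```python
import torch

class EMA:
    """Exponential moving average of a model's parameters (illustrative sketch)."""
    def __init__(self, model, alpha=0.999):
        self.alpha = alpha
        # shadow copy of all trainable parameters
        self.shadow = {name: p.detach().clone()
                       for name, p in model.named_parameters() if p.requires_grad}

    @torch.no_grad()
    def update(self, model):
        # shadow = alpha * shadow + (1 - alpha) * current
        for name, p in model.named_parameters():
            if name in self.shadow:
                self.shadow[name].mul_(self.alpha).add_(p.detach(), alpha=1.0 - self.alpha)

    @torch.no_grad()
    def copy_to(self, model):
        # overwrite model parameters with their EMA values, e.g. before evaluation
        for name, p in model.named_parameters():
            if name in self.shadow:
                p.copy_(self.shadow[name])
```

Typical usage would be to call `update(model)` after each optimizer step and `copy_to(model)` (or a separate evaluation copy of the model) before validation.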

For those who happen to find this repo, if you see errors in my code, feel free to open an issue to correct me.