This code repository is provided for the members of the SML-Group led by Prof. Tongliang Liu. Its main topic is Learning with Label Noise. It includes the following:
- Commonly used datasets and how to generate label noise for synthetic experiments.
- Important baselines.
- Synthetic Datasets: MNIST, CIFAR10/100, SVHN, Fashion-MNIST.
- Real-world Datasets: ImageNet, WebVision, Clothing1M, Food101.
In this section, we consider two kinds of label noise: class-dependent label noise and instance-dependent label noise.
We corrupt the training and validation sets manually according to true transition matrices T (see details in utils.py). The flipping settings include symmetry flipping and pair flipping. You can use the noise rate parameter to control the flip rate, the random seed parameter to control the generation of different noisy labels, and the split parameter to control the ratio of the training and validation sets, as sketched below.
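A minimal sketch of this generation process, assuming NumPy; the helper names `build_transition_matrix` and `flip_labels` are illustrative, not the actual API in utils.py:

```python
import numpy as np

def build_transition_matrix(noise_rate, num_classes, flipping="symmetry"):
    """Build a transition matrix T with T[i][j] = P(noisy label = j | clean label = i)."""
    if flipping == "symmetry":
        # Symmetry flipping: a label flips to each of the other classes
        # with equal probability noise_rate / (num_classes - 1).
        T = np.full((num_classes, num_classes), noise_rate / (num_classes - 1))
        np.fill_diagonal(T, 1.0 - noise_rate)
    elif flipping == "pair":
        # Pair flipping: a label flips only to the "next" class, i -> (i + 1) mod C.
        T = np.eye(num_classes) * (1.0 - noise_rate)
        for i in range(num_classes):
            T[i][(i + 1) % num_classes] = noise_rate
    else:
        raise ValueError(f"unknown flipping setting: {flipping}")
    return T

def flip_labels(clean_labels, T, random_seed=0):
    """Draw each noisy label from the row of T indexed by the clean label."""
    rng = np.random.RandomState(random_seed)
    num_classes = T.shape[0]
    return np.array([rng.choice(num_classes, p=T[y]) for y in clean_labels])
```

For example, on CIFAR-10 with a noise rate of 0.2, symmetry flipping keeps 80% of the labels intact and spreads the remaining 20% uniformly over the other nine classes, while pair flipping moves that 20% entirely into the single adjacent class.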
Coming soon.
- Cross entropy loss function. It is worth mentioning that PyTorch merges log_softmax and nll_loss to serve as the cross entropy loss function (see the sketch after this list).
- Forward
- Backward
- Reweight
- T_revision
- Decoupling
- MentorNet
- Co-teaching
- Co-teaching Plus
- D2L
- Symmetric Loss
- Deep Self-Learning
- L_DMI
- Co-Regularization
- DAC
- GCE
- GLC
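As noted in the cross entropy item above, PyTorch's `F.cross_entropy` fuses `log_softmax` and `nll_loss` into one call; a minimal sketch verifying that equivalence (the batch and class sizes here are arbitrary):

```python
import torch
import torch.nn.functional as F

# Random logits for a batch of 4 examples over 10 classes, with integer class targets.
logits = torch.randn(4, 10)
targets = torch.randint(0, 10, (4,))

# cross_entropy applies log_softmax followed by nll_loss internally.
loss_merged = F.cross_entropy(logits, targets)
loss_manual = F.nll_loss(F.log_softmax(logits, dim=1), targets)

assert torch.allclose(loss_merged, loss_manual)
```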