Logo by Zhuoning Yuan
Website | Updates | Installation | Tutorial | Research | Github
We continuously update our library by making improvements and adding new features. If you use or like our library, please star⭐ this repo. Thank you!
X-risk refers to a family of compositional measures/losses, in which each data point is compared with a set of data points, explicitly or implicitly, to define a risk function. It covers many widely used measures/losses, which can be organized into four interconnected categories (a schematic form is given after the list):
- Areas under the curves, including areas under ROC curves (AUROC), areas under Precision-Recall curves (AUPRC), and one-way and two-way partial areas under ROC curves.
- Ranking measures/objectives, including p-norm push for bipartite ranking, listwise losses for learning to rank (e.g., ListNet), mean average precision (mAP), normalized discounted cumulative gain (NDCG), etc.
- Performance at the top, including top push, top-K variants of mAP and NDCG, Recall at top K positions (Rec@K), Precision at a certain Recall level (Prec@Rec), etc.
- Contrastive objectives, including supervised contrastive objectives (e.g., NCA), and global self-supervised contrastive objectives improving upon SimCLR and CLIP.
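Schematically, and following the notation of the X-risk paper cited at the end of this README (the exact form varies by measure), an X-risk couples each data point with a comparison set:

$$\min_{\mathbf{w}}\ \frac{1}{n}\sum_{i=1}^{n} f\big(g(\mathbf{w}; z_i, S_i)\big)$$

where $g(\mathbf{w}; z_i, S_i)$ compares the point $z_i$ against the set $S_i$ (e.g., via pairwise ranking scores), and $f$ is a compositional function; the measures above are instances of this template.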
- Easy Installation - Easy to install and integrate LibAUC into existing training pipelines built on deep learning frameworks such as PyTorch.
- Broad Applications - Users can train different neural network architectures (e.g., MLP, CNN, GNN, Transformer) suited to their data types.
- Efficient Algorithms - Stochastic algorithms with provable convergence guarantees that support learning with millions of data points without requiring large batch sizes.
- Hands-on Tutorials - Tutorials are provided for optimizing a variety of measures and objectives from the X-risk family.
$ pip install libauc==1.2.0rc0
The latest version (1.2.0rc0) will be updated soon! You can also download the source code for previous versions here.
>>> # import our loss and optimizer
>>> from libauc.losses import AUCMLoss
>>> from libauc.optimizers import PESG
...
>>> # define loss & optimizer (PESG takes the model and loss to manage their internal variables)
>>> loss_fn = AUCMLoss()
>>> optimizer = PESG(model, loss_fn=loss_fn, lr=0.1)
...
>>> # training
>>> model.train()
>>> for data, targets in trainloader:
...     data, targets = data.cuda(), targets.cuda()
...     logits = model(data)
...     preds = torch.sigmoid(logits)   # AUCMLoss expects scores in [0, 1]
...     loss = loss_fn(preds, targets)
...     optimizer.zero_grad()
...     loss.backward()
...     optimizer.step()
...
>>> # update the optimizer's internal parameters (typically once per epoch)
>>> optimizer.update_regularizer()
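For context, here is a minimal self-contained sketch of the loop above. The toy data, network, and hyperparameter values are illustrative placeholders, not LibAUC defaults, and the `PESG` constructor arguments reflect our reading of the 1.2.0 API:

```python
# Minimal end-to-end sketch (assumes a GPU and a binary task with labels in {0, 1};
# the toy data, network, and constructor arguments are illustrative).
import torch
from torch.utils.data import DataLoader, TensorDataset
from libauc.losses import AUCMLoss
from libauc.optimizers import PESG

X = torch.randn(1000, 32)                # 1000 samples, 32 features
y = (torch.rand(1000) < 0.1).float()     # ~10% positives (imbalanced)
trainloader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = torch.nn.Linear(32, 1).cuda()
loss_fn = AUCMLoss()
optimizer = PESG(model, loss_fn=loss_fn, lr=0.1)

model.train()
for epoch in range(10):
    for data, targets in trainloader:
        data, targets = data.cuda(), targets.cuda()
        preds = torch.sigmoid(model(data).squeeze(1))  # scores in [0, 1]
        loss = loss_fn(preds, targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    optimizer.update_regularizer()  # refresh internal variables at the end of each epoch
```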
- AUROC: Optimizing AUROC loss on an imbalanced dataset
- AUPRC: Optimizing AUPRC loss on an imbalanced dataset
- Partial AUROC: Optimizing Partial AUC loss on an imbalanced dataset
- Compositional AUROC: Optimizing Compositional AUROC loss on an imbalanced dataset
- NDCG: Optimizing NDCG loss on MovieLens 20M
- SogCLR: Optimizing contrastive loss using small batch size on ImageNet-1K
- A Tutorial on the Imbalanced Data Sampler
- Constructing benchmark imbalanced datasets for CIFAR10, CIFAR100, CATvsDOG, STL10
- Using LibAUC with a PyTorch learning rate scheduler (see the sketch after the examples below)
- Optimizing AUROC loss on Chest X-Ray dataset (CheXpert)
- Optimizing AUROC loss on Skin Cancer dataset (Melanoma)
- Optimizing AUROC loss on Molecular Graph dataset (OGB-Molhiv)
- Optimizing multi-task AUROC loss on Chest X-Ray dataset (CheXpert)
- Optimizing AUROC loss on Tabular dataset (Credit Fraud)
- Optimizing AUROC loss for Federated Learning
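Below is a hedged sketch for the learning-rate-scheduler tip above, assuming LibAUC optimizers implement the standard `torch.optim.Optimizer` interface; the model, loss, and hyperparameters are placeholders:

```python
# Hedged sketch: attach a standard PyTorch LR scheduler to a LibAUC optimizer
# (assumes PESG follows the torch.optim.Optimizer interface).
import torch
from torch.optim.lr_scheduler import CosineAnnealingLR
from libauc.losses import AUCMLoss
from libauc.optimizers import PESG

model = torch.nn.Linear(32, 1)
loss_fn = AUCMLoss()
optimizer = PESG(model, loss_fn=loss_fn, lr=0.1)  # constructor args are illustrative
scheduler = CosineAnnealingLR(optimizer, T_max=100)

for epoch in range(100):
    # ... one epoch of training as in the quick-start snippet above ...
    scheduler.step()  # decay the learning rate once per epoch
```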
If you find LibAUC useful in your work, please acknowledge our library and cite the following papers:
@misc{libauc2022,
  title={LibAUC: A Deep Learning Library for X-Risk Optimization},
  author={Zhuoning Yuan and Zi-Hao Qiu and Gang Li and Dixian Zhu and Zhishuai Guo and Quanqi Hu and Bokun Wang and Qi Qi and Yongjian Zhong and Tianbao Yang},
  year={2022}
}
@article{dox22,
  title={Algorithmic Foundation of Deep X-risk Optimization},
  author={Tianbao Yang},
  journal={CoRR},
  year={2022}
}
For any technical questions, please open a new issue on GitHub. For any other questions, please contact Zhuoning Yuan [yzhuoning@gmail.com] or Tianbao Yang [tianbao-yang@uiowa.edu].