MRCpy: A Library for Minimax Risk Classifiers


MRCpy implements recently proposed supervised classification techniques called minimax risk classifiers (MRCs). MRCs are based on robust risk minimization and can use the 0-1 loss, in contrast to existing libraries built on empirical risk minimization and surrogate losses. These techniques give rise to a manifold of classification methods that can provide tight bounds on the expected loss, enable efficient learning in high dimensions, and adapt to distribution shifts. MRCpy provides a unified interface for the different MRC variants and follows the standards of popular Python libraries. The library also provides implementations of popular techniques that can be seen as MRCs, such as L1-regularized logistic regression, zero-one adversarial classification, and maximum entropy machines.
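
As an illustration of these performance guarantees, the sketch below trains an MRC with 0-1 loss and queries its bounds on the expected loss. It assumes the scikit-learn-style MRC estimator and the get_upper_bound()/get_lower_bound() accessors described in the documentation; parameter names and defaults should be checked against the installed version.

# Sketch: train a minimax risk classifier with 0-1 loss and inspect the
# bounds it provides on the expected loss. Assumes the MRC estimator and
# the get_upper_bound()/get_lower_bound() accessors from the documentation.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

from MRCpy import MRC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = MRC(loss='0-1')
clf.fit(X_train, y_train)

print('Test error:', np.mean(clf.predict(X_test) != y_test))
print('Upper bound on the expected 0-1 loss:', clf.get_upper_bound())
print('Lower bound on the expected 0-1 loss:', clf.get_lower_bound())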

Algorithms

Installation


The latest release of MRCpy can be installed using pip:

pip install MRCpy

Alternatively, the development version of MRCpy can be installed from GitHub as follows:

git clone https://github.com/MachineLearningBCAM/MRCpy.git
cd MRCpy
python3 setup.py install

NOTE: The CVXpy-based solver in the library uses the GUROBI optimizer, which requires a license. A free academic license is available from GUROBI.

Dependencies

  • Python >= 3.8
  • numpy >= 1.18.1, scipy >= 1.4.1, scikit-learn >= 0.21.0, cvxpy, mosek, pandas

Usage

See the MRCpy documentation page for full details on installation, the API, usage, and examples.
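
Because the estimators follow scikit-learn conventions, they can be combined with standard scikit-learn tooling. The snippet below is an illustrative sketch only, assuming the CMRC estimator exposed by the package; see the documentation page for the actual list of estimators and their parameters.

# Sketch: use an MRCpy estimator inside a standard scikit-learn pipeline.
# CMRC with log loss is assumed here; check the documentation for the
# available estimators, losses, and feature mappings.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

from MRCpy import CMRC

X, y = load_iris(return_X_y=True)

pipe = make_pipeline(StandardScaler(), CMRC(loss='log'))
scores = cross_val_score(pipe, X, y, cv=5)
print('Cross-validated accuracy: %.3f +/- %.3f' % (scores.mean(), scores.std()))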

Citations

This repository is the official implementation of Minimax Risk Classifiers proposed in the following papers. If you use MRCpy in a scientific publication, we would appreciate citations to:

  • [1] [Mazuelas, S., Zanoni, A., & Pérez, A. (2020). Minimax Classification with 0-1 Loss and Performance Guarantees. Advances in Neural Information Processing Systems, 33, 302-312.](https://arxiv.org/abs/2010.07964)

      @article{mazuelas2020minimax,
      title={Minimax Classification with 0-1 Loss and Performance Guarantees},
      author={Mazuelas, Santiago and Zanoni, Andrea and P{\'e}rez, Aritz},
      journal={Advances in Neural Information Processing Systems},
      volume={33},
      pages={302--312},
      year={2020}
      }
    
  • [2] [Mazuelas, S., Shen, Y., & Pérez, A. (2022). Generalized Maximum Entropy for Supervised Classification. IEEE Transactions on Information Theory, 68(4), 2530-2550.](https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9682746)

      @article{MazShePer:22,
      title={Generalized Maximum Entropy for Supervised Classification},
      author={Mazuelas, Santiago and Shen, Yuan and P{\'e}rez, Aritz},
      journal={IEEE Transactions on Information Theory},
      volume={68},
      number={4},
      pages={2530--2550},
      year={2022}
      }
    
  • [3] [Bondugula, K., Alvarez, V., Segovia-Martín, J. I., Mazuelas, S., & Pérez, A. (2021). MRCpy: A Library for Minimax Risk Classifiers. arXiv preprint arXiv:2108.01952.](https://arxiv.org/abs/2108.01952)

      @article{bondugula2021mrcpy,
      title={MRCpy: A Library for Minimax Risk Classifiers},
      author={Bondugula, Kartheek and Alvarez, Veronica and Segovia-Mart{\'i}n, J. I. and Mazuelas, Santiago and P{\'e}rez, Aritz},
      journal={arXiv preprint arXiv:2108.01952},
      year={2021}
      }
    

Updates and Discussion

You can subscribe to the MRCpy mailing list for updates and discussion.