A re-implementation of the original code in JAX/Haiku. JAX is currently among the fastest deep learning frameworks, so re-implementing OOD algorithms brings faster training and inference; it is also, personally, a way for me to learn in detail how these algorithms are implemented (a sketch of the core IRM penalty follows this note).
Further improvements to OOD approaches will be branched from this code base.
-- Anique
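As a rough illustration of the kind of code this port contains, below is a minimal sketch of the IRMv1 penalty written in plain JAX. The function names and the binary-classification loss are illustrative assumptions, not this repository's actual API.

```python
import jax
import jax.numpy as jnp

def binary_nll(logits, labels):
    # Mean binary cross-entropy from logits (numerically stable).
    return -jnp.mean(labels * jax.nn.log_sigmoid(logits)
                     + (1.0 - labels) * jax.nn.log_sigmoid(-logits))

def irm_penalty(logits, labels):
    # IRMv1 penalty: squared gradient of the per-environment risk
    # with respect to a frozen dummy classifier scale of 1.0.
    grad = jax.grad(lambda scale: binary_nll(scale * logits, labels))(1.0)
    return grad ** 2
```

In a full training loop, this penalty would be computed per environment, averaged, and added to the empirical risk with a penalty weight, as described in the paper cited below.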
Source code for the paper:
@article{InvariantRiskMinimization,
  title={Invariant Risk Minimization},
  author={Arjovsky, Martin and Bottou, L{\'e}on and Gulrajani, Ishaan and Lopez-Paz, David},
  journal={arXiv},
  year={2019}
}
This repository is licensed under the terms in the LICENSE file.