LtU-ILI

Robust ML in Astro

The Learning the Universe Implicit Likelihood Inference (LtU-ILI) pipeline is an all-in-one framework for performing machine learning parameter inference in astrophysics and cosmology. Given labeled training data ${(x_i,\theta_i)}_{i=1}^N$ or a stochastic simulator $x(\theta)$, LtU-ILI is designed to automatically train state-of-the-art neural networks to learn the data-parameter relationship and produce robust, well-calibrated posterior inference.
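
To make the setup concrete, here is a hedged toy sketch (not part of LtU-ILI) of what a stochastic simulator $x(\theta)$ and a labeled training set ${(x_i,\theta_i)}_{i=1}^N$ look like in practice; the `simulator` function and the noise model are illustrative assumptions only:

```python
import numpy as np

# Hypothetical toy simulator (NOT from LtU-ILI): maps a 1-D parameter theta
# to noisy 2-D data x(theta), used to build a labeled set {(x_i, theta_i)}.
rng = np.random.default_rng(0)

def simulator(theta):
    """Noisy observation: x = [theta, theta**2] + Gaussian noise."""
    return np.array([theta, theta**2]) + 0.1 * rng.normal(size=2)

N = 1000
theta = rng.uniform(-1, 1, size=N)           # draw parameters from the prior
x = np.stack([simulator(t) for t in theta])  # simulate matching data

print(x.shape, theta.shape)  # (1000, 2) (1000,)
```

Arrays shaped like these `x` and `theta` are exactly what the data loader in the example below consumes.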

The pipeline is quick and easy to set up; here's an example of training a Masked Autoregressive Flow (MAF) network to estimate the posterior over parameters $\theta$ (the Y array below), given input data $x$ (the X array):

import ili                                      # Main LtU-ILI package

X, Y = load_data()                              # Load training data and parameters
loader = ili.data.NumpyLoader(X, Y)             # Create a data loader

trainer = ili.inference.InferenceRunner.load(
  backend = 'sbi', engine='NPE',                # Choose a backend and inference engine (here, Neural Posterior Estimation)
  prior = ili.utils.Uniform(low=-1, high=1),    # Define a prior 
  # Define a neural network architecture (here, MAF)
  nets = [ili.utils.load_nde_sbi(engine='NPE', model='maf')]  
)

posterior, _ = trainer(loader)                  # Run training to map data -> parameters

samples = posterior.sample(                     # Generate 1000 samples from the posterior for input x[0]
  x=X[0], sample_shape=(1000,)
)

Beyond this simple example, LtU-ILI comes with a wide range of customizable complexity, including:

  • Posterior-, Likelihood-, and Ratio-Estimation methods for ILI, including Sequential learning analogs
  • Various neural density estimators (Mixture Density Networks, Conditional Normalizing Flows, ResNet-like ratio classifiers)
  • Fully-customizable, exotic embedding networks (including CNNs and Graph Neural Networks)
  • A unified interface for multiple ILI backends (sbi, pydelfi, and lampe)
  • Multiple marginal and multivariate posterior coverage metrics
  • Jupyter and command-line interfaces
  • A parallelizable configuration framework for efficient hyperparameter tuning and production runs
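
To illustrate what the posterior coverage metrics above are testing, here is a minimal self-contained sketch (plain NumPy, not the LtU-ILI API) of an expected-coverage check on an analytically calibrated Gaussian posterior: for a well-calibrated posterior, the X% credible interval should contain the true parameter roughly X% of the time.

```python
import numpy as np

# Coverage-check sketch (NOT the LtU-ILI API). Gaussian toy model:
# theta ~ N(0, 1), x = theta + noise with noise ~ N(0, 1),
# so the exact posterior is theta | x ~ N(x/2, 1/2).
rng = np.random.default_rng(1)
n_trials, n_samples = 2000, 500

theta = rng.normal(size=n_trials)                  # true parameters
x = theta + rng.normal(size=n_trials)              # observed data
post = x[:, None] / 2 + np.sqrt(0.5) * rng.normal(size=(n_trials, n_samples))

lo, hi = np.percentile(post, [16, 84], axis=1)     # per-trial 68% interval
coverage = np.mean((theta >= lo) & (theta <= hi))  # fraction containing truth
print(round(float(coverage), 2))  # near 0.68 for a calibrated posterior
```

A miscalibrated (e.g. overconfident) posterior would show coverage well below the nominal level; LtU-ILI's validation metrics automate checks of this kind for trained networks.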

For more details on the motivation, design, and theoretical background of this project, see the software release paper.

Getting Started

To install LtU-ILI, follow the instructions in INSTALL.md.

To get started, try out the tutorial for the Jupyter notebook interface in notebooks/tutorial.ipynb or the command line interface in examples/.

API Documentation

The documentation for this project can be found at this link.

References

We keep an updated repository of relevant interesting papers and resources at this link.

Contributing

Before contributing, please familiarize yourself with the contribution workflow described in CONTRIBUTING.md.

Contact

If you have comments, questions, or feedback, please open an issue on GitHub. The current leads of the Learning the Universe ILI working group are Benjamin Wandelt (benwandelt@gmail.com) and Matthew Ho (matthew.annam.ho@gmail.com).

Contributors

  • Matt Ho: 💻 🎨 💡 📖 👀 🚇 🖋 🔬
  • Deaglan Bartlett: 💻 🎨 💡 📖 👀 🚇 🖋 🔬
  • Nicolas Chartier: 💡 📖 🔬 💻 🎨 👀 🖋
  • Carolina Cuesta: 💻 🎨 💡 📖 👀 🔬
  • Simon: 💻 💡
  • Axel Lapel: 💻 🔬 💡
  • Pablo Lemos: 🎨 💻
  • Chris Lovell: 🔬 💡 🔣 🖋
  • T. Lucas Makinen: 💻 🔬
  • Chirag Modi: 🎨 💻
  • Shivam Pandey: 🔬 💡
  • L.A. Perez: 🔬 🖋

Acknowledgements

This work is supported by the Simons Foundation through the Simons Collaboration on Learning the Universe.