sbi: simulation-based inference

Getting Started | Documentation

sbi is a PyTorch package for simulation-based inference. Simulation-based inference is the task of inferring the parameters of a simulator from observations.

sbi takes a Bayesian approach and returns a full posterior distribution over the parameters of the simulator, conditional on the observations. The package implements a variety of inference algorithms, including amortized and sequential methods. Amortized methods return a posterior that can be applied to many different observations without retraining; sequential methods focus the inference on one particular observation to be more simulation-efficient. See below for an overview of implemented methods.

sbi offers a simple interface for one-line posterior inference:

from sbi.inference import infer
# import your simulator, define your prior over the parameters
parameter_posterior = infer(simulator, prior, method='SNPE', num_simulations=100)
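As a fuller illustration, the following sketch defines a toy simulator and prior, runs infer, and then samples from the resulting posterior; the Gaussian toy simulator, the BoxUniform prior, and the observation x_o are assumptions made for this example only:

import torch
from sbi.inference import infer
from sbi.utils import BoxUniform

# Toy simulator: returns the parameters perturbed by Gaussian noise (illustrative assumption).
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

# Uniform prior over a 3-dimensional parameter space.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

# One-line training of an amortized neural posterior.
parameter_posterior = infer(simulator, prior, method="SNPE", num_simulations=100)

# Condition on an observation and draw posterior samples.
x_o = torch.zeros(3)
samples = parameter_posterior.sample((1000,), x=x_o)

Because the posterior returned here is amortized, the same parameter_posterior object can be queried with different observations x without retraining.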

Installation

sbi requires Python 3.6 or higher. We recommend using a conda virtual environment (see the Miniconda installation instructions). If conda is installed on the system, an environment for installing sbi can be created as follows:

# Create an environment for sbi (indicate Python 3.6 or higher); activate it
$ conda create -n sbi_env python=3.7 && conda activate sbi_env

Whether or not you use conda, sbi can then be installed with pip:

pip install sbi

To test the installation, drop into a Python prompt and run:

from sbi.examples.minimal import simple
posterior = simple()
print(posterior)

Inference Algorithms

The following algorithms are currently available. You can find a tutorial on how to run each of these methods here.

Neural Posterior Estimation: amortized (NPE) and sequential (SNPE)

Neural Likelihood Estimation: amortized (NLE) and sequential (SNLE)

Neural Ratio Estimation: amortized (NRE) and sequential (SNRE)

Neural Variational Inference: amortized (NVI) and sequential (SNVI)

Mixed Neural Likelihood Estimation (MNLE)
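Besides the one-line infer() shortcut, every algorithm can be run through sbi's flexible interface, in which simulation, training, and posterior construction are separate steps. The following is a minimal sketch using SNPE; the toy simulator, the prior, and the number of simulations are assumptions chosen for illustration:

import torch
from sbi.inference import SNPE, prepare_for_sbi, simulate_for_sbi
from sbi.utils import BoxUniform

# Toy problem: uniform prior and a noisy identity simulator (illustrative assumptions).
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

# Wrap simulator and prior so they conform to sbi's expected batch conventions.
simulator, prior = prepare_for_sbi(simulator, prior)

# Simulate training data, train the neural posterior estimator, and build the posterior.
inference = SNPE(prior=prior)
theta, x = simulate_for_sbi(simulator, proposal=prior, num_simulations=500)
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)

The sequential variants follow the same pattern but run multiple rounds, using the posterior from the previous round as the proposal for new simulations around a fixed observation.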

Feedback and Contributions

We welcome any feedback on how sbi is working for your inference problems (see Discussions) and are happy to receive bug reports, pull requests, and other contributions (see contribute). We wish to maintain a positive community; please read our Code of Conduct.

Acknowledgements

sbi is the successor (using PyTorch) of the delfi package. It was started as a fork of Conor M. Durkan's lfi. sbi runs as a community project; development is coordinated at the mackelab. See also credits.

Support

sbi has been supported by the German Federal Ministry of Education and Research (BMBF) through the project ADIMEM (FKZ 01IS18052 A-D). ADIMEM is a collaborative project between the groups of Jakob Macke (Uni Tübingen), Philipp Berens (Uni Tübingen), Philipp Hennig (Uni Tübingen), and Marcel Oberlaender (caesar Bonn) that aims to develop inference methods for mechanistic models.

License

Affero General Public License v3 (AGPLv3)

Citation

If you use sbi, consider citing the sbi software paper in addition to the original research articles describing the specific sbi algorithm(s) you use.

@article{tejero-cantero2020sbi,
  doi = {10.21105/joss.02505},
  url = {https://doi.org/10.21105/joss.02505},
  year = {2020},
  publisher = {The Open Journal},
  volume = {5},
  number = {52},
  pages = {2505},
  author = {Alvaro Tejero-Cantero and Jan Boelts and Michael Deistler and Jan-Matthis Lueckmann and Conor Durkan and Pedro J. Gonçalves and David S. Greenberg and Jakob H. Macke},
  title = {sbi: A toolkit for simulation-based inference},
  journal = {Journal of Open Source Software}
}

The above citation refers to the original version of the sbi project and has a persistent DOI. Additionally, new releases of sbi are citable via Zenodo, where we create a new DOI for every release.