rlberry - A Reinforcement Learning Library for Research and Education

An easy-to-use reinforcement learning library for research and education.

Try it on Google Colab!


What is rlberry?

Writing reinforcement learning algorithms is fun! But after the fun, we have lots of boring things to implement: running our agents in parallel, averaging and plotting results, optimizing hyperparameters, comparing to baselines, creating tricky environments, and so on!

rlberry is a Python library that makes your life easier by doing all these things with a few lines of code, so that you can spend most of your time developing agents. rlberry also provides implementations of several RL agents, benchmark environments and many other useful tools.
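To make this concrete, here is a minimal sketch of that workflow. The names below (UCBVIAgent, GridWorld, AgentManager, evaluate_agents) are assumptions about the rlberry API and may differ between versions; see the documentation for the exact interface.

# Minimal sketch of the "few lines of code" workflow described above.
# The names used here (UCBVIAgent, GridWorld, AgentManager, evaluate_agents)
# are assumptions about the rlberry API and may differ between versions.
from rlberry.agents import UCBVIAgent          # assumed tabular RL agent
from rlberry.envs import GridWorld             # assumed benchmark environment
from rlberry.manager import AgentManager, evaluate_agents

# Train several independent copies of the agent in parallel and average results.
manager = AgentManager(
    UCBVIAgent,
    (GridWorld, {}),      # (environment constructor, constructor kwargs)
    fit_budget=10_000,    # training budget, e.g. number of environment steps
    n_fit=4,              # number of parallel instances
)
manager.fit()

# Evaluate the trained agents over a number of Monte Carlo simulations.
evaluate_agents([manager], n_simulations=50)

The idea is that the manager takes care of parallelization, averaging and logging, so the agent code itself stays short.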

Check our documentation and our getting started section!

Getting started

We provide a handful of notebooks on Google Colab as examples to show you how to use rlberry.

Content                     Description
Introduction to rlberry     How to create an agent, optimize its hyperparameters, and compare it to a baseline.
RL Experimental Pipeline    How to define a configuration, run experiments in parallel, and save a config.json for reproducibility.
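
To give a rough flavor of the hyperparameter-optimization step covered in the first notebook, here is a hedged sketch; optimize_hyperparams and its arguments are assumptions about the AgentManager interface, and the notebook remains the authoritative reference.

# Hedged sketch of hyperparameter optimization; optimize_hyperparams and its
# arguments are assumed names, so refer to the notebook for the exact interface.
from rlberry.agents import UCBVIAgent
from rlberry.envs import GridWorld
from rlberry.manager import AgentManager

manager = AgentManager(
    UCBVIAgent,
    (GridWorld, {}),
    fit_budget=5_000,
    n_fit=2,
)

# Search for good hyperparameters (rlberry relies on Optuna for this in recent
# versions), then refit the agent with the best configuration found.
manager.optimize_hyperparams(n_trials=20)
manager.fit()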

Citing rlberry

If you use rlberry in scientific publications, we would appreciate citations using the following BibTeX entry:

@misc{rlberry,
    author = {Domingues, Omar Darwiche and Flet-Berliac, Yannis and Leurent, Edouard and M{\'e}nard, Pierre and Shang, Xuedong and Valko, Michal},
    title = {{rlberry - A Reinforcement Learning Library for Research and Education}},
    year = {2021},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/rlberry-py/rlberry}}
}

Tests

To run tests, install test dependencies with pip install -e .[test] and run pytest.

To check coverage, install test dependencies and run

$ cd scripts
$ bash run_testscov.sh

and open the coverage report in cov_html/index.html.

Contributing

Want to contribute to rlberry? Please check our contribution guidelines. A list of interesting TODOs will be available soon. If you want to add any new agents or environments, do not hesitate to open an issue!