Bandits

Python library for Multi-Armed Bandits

Implements the following algorithms:

  • Epsilon-Greedy
  • UCB1
  • Softmax
  • Thompson Sampling (Bayesian)
    • Bernoulli and Binomial rewards with conjugate Beta priors (see the sketch after this list)
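
For instance, Thompson Sampling with Bernoulli rewards exploits Beta conjugacy: each arm keeps a Beta(alpha, beta) posterior over its success probability, one sample is drawn from every posterior, and the arm with the largest sample is pulled. The sketch below is self-contained NumPy, not this library's API; all names are illustrative.

import numpy as np

rng = np.random.default_rng(0)
true_probs = [0.2, 0.5, 0.7]   # hypothetical Bernoulli arm means
n_arms = len(true_probs)
alpha = np.ones(n_arms)        # Beta posterior: 1 + successes per arm
beta = np.ones(n_arms)         # Beta posterior: 1 + failures per arm

for _ in range(1000):
    # Draw one plausible mean per arm from its Beta posterior.
    theta = rng.beta(alpha, beta)
    arm = int(np.argmax(theta))          # pull the best-sampling arm
    reward = rng.random() < true_probs[arm]
    # Conjugate update: a Bernoulli reward increments alpha or beta.
    alpha[arm] += reward
    beta[arm] += 1 - reward

print(alpha / (alpha + beta))  # posterior mean estimate per arm

The same update covers Binomial rewards: add the batch's success count to alpha and its failure count to beta.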

Installation

You can install bandits with:

git clone https://github.com/bgalbraith/bandits.git
cd bandits
pip install .
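
Assuming the package installs under the import name bandits, a quick smoke test is:

python -c "import bandits"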

Examples
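
The repository ships example notebooks; as a stand-in here, the following self-contained NumPy sketch compares Epsilon-Greedy and UCB1 on the same three-armed Bernoulli bandit. Function and parameter names are illustrative, not part of this library's API.

import numpy as np

def run(select, n_steps=2000, probs=(0.2, 0.5, 0.7), seed=1):
    rng = np.random.default_rng(seed)
    n = np.zeros(len(probs))   # pull counts per arm
    q = np.zeros(len(probs))   # empirical mean reward per arm
    total = 0.0
    for t in range(1, n_steps + 1):
        arm = select(q, n, t, rng)
        reward = float(rng.random() < probs[arm])
        n[arm] += 1
        q[arm] += (reward - q[arm]) / n[arm]   # incremental mean update
        total += reward
    return total / n_steps

def epsilon_greedy(q, n, t, rng, eps=0.1):
    # Explore uniformly with probability eps, otherwise exploit.
    if rng.random() < eps:
        return int(rng.integers(len(q)))
    return int(np.argmax(q))

def ucb1(q, n, t, rng):
    # Play each arm once, then maximize mean plus exploration bonus.
    if 0 in n:
        return int(np.argmin(n))
    return int(np.argmax(q + np.sqrt(2 * np.log(t) / n)))

print("Epsilon-Greedy avg reward:", run(epsilon_greedy))
print("UCB1 avg reward:          ", run(ucb1))

With these arm means, UCB1 typically edges out a fixed epsilon because its exploration bonus shrinks as pull counts grow, while Epsilon-Greedy keeps exploring at a constant rate forever.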
