
Bandits

Python library for Multi-Armed Bandits, with a multifidelity extension

Implements the following algorithms:

  • Epsilon-Greedy
  • UCB1
  • Softmax
  • Thompson Sampling (Bayesian)
    • Bernoulli and Binomial rewards with conjugate Beta priors
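
The library's actual interfaces live in its notebooks and are not reproduced here; as a rough orientation only, minimal self-contained sketches of the four selection rules (not this library's API) might look like the following. Each function takes the running statistics for the arms and returns the index of the arm to pull next.

```python
# Minimal reference sketches of the four strategies (not this library's API).
import math
import random


def epsilon_greedy(values, epsilon=0.1):
    """With probability epsilon explore a uniformly random arm, else exploit."""
    if random.random() < epsilon:
        return random.randrange(len(values))
    return max(range(len(values)), key=lambda a: values[a])


def ucb1(counts, values):
    """Pick the arm maximizing mean + sqrt(2 ln t / n_a); try each arm once first."""
    for a, n in enumerate(counts):
        if n == 0:
            return a
    t = sum(counts)
    return max(range(len(values)),
               key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))


def softmax(values, temperature=0.1):
    """Sample an arm with probability proportional to exp(value / temperature)."""
    prefs = [math.exp(v / temperature) for v in values]
    total = sum(prefs)
    r, acc = random.random() * total, 0.0
    for a, p in enumerate(prefs):
        acc += p
        if r <= acc:
            return a
    return len(prefs) - 1


def thompson_bernoulli(successes, failures):
    """Beta-Bernoulli Thompson sampling: with a Beta(1, 1) prior, an arm with
    s successes and f failures has posterior Beta(s + 1, f + 1); draw one
    sample per arm and play the argmax."""
    samples = [random.betavariate(s + 1, f + 1)
               for s, f in zip(successes, failures)]
    return max(range(len(samples)), key=lambda a: samples[a])
```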

Examples
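
As a small illustration of how these rules are evaluated, the following sketch (reusing the hypothetical `thompson_bernoulli` defined above, not an example shipped with the library) runs Thompson Sampling on a simulated three-armed Bernoulli bandit and prints the posterior mean estimates:

```python
# Hypothetical usage: Thompson Sampling on a simulated 3-armed Bernoulli bandit.
import random

true_probs = [0.2, 0.5, 0.7]   # hidden reward probabilities of the three arms
successes = [0] * 3
failures = [0] * 3

for _ in range(1000):
    arm = thompson_bernoulli(successes, failures)
    reward = 1 if random.random() < true_probs[arm] else 0
    if reward:
        successes[arm] += 1
    else:
        failures[arm] += 1

# Posterior means concentrate on the best arm, which also gets most of the pulls.
estimates = [s / max(1, s + f) for s, f in zip(successes, failures)]
print("pulls per arm:", [s + f for s, f in zip(successes, failures)])
print("posterior mean estimates:", [round(e, 2) for e in estimates])
```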
