Python library for Multi-Armed Bandits
Implements the following algorithms:
- Epsilon-Greedy
- UCB1
- Softmax
- Thompson Sampling (Bayesian)
  - Bernoulli and Binomial rewards with conjugate Beta distributions
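The strategies above can be sketched on simulated Bernoulli arms. This is a minimal illustration under assumed names (`pull`, `TRUE_PROBS`, etc.), not this library's actual API: epsilon-greedy explores with probability `eps`, UCB1 adds a confidence bonus to each empirical mean, and Thompson Sampling keeps a Beta posterior per arm (the Bernoulli likelihood is conjugate to the Beta prior, so updates are just counter increments).

```python
import math
import random

# Hypothetical simulation setup, not part of the library.
TRUE_PROBS = [0.2, 0.5, 0.7]  # unknown-to-the-agent arm payout rates
N_ARMS = len(TRUE_PROBS)

def pull(arm):
    """Simulate one Bernoulli reward from the chosen arm."""
    return 1 if random.random() < TRUE_PROBS[arm] else 0

def epsilon_greedy(n_rounds=2000, eps=0.1):
    counts = [0] * N_ARMS
    values = [0.0] * N_ARMS  # running mean reward per arm
    for _ in range(n_rounds):
        if random.random() < eps:
            arm = random.randrange(N_ARMS)                    # explore
        else:
            arm = max(range(N_ARMS), key=values.__getitem__)  # exploit
        r = pull(arm)
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]  # incremental mean
    return values

def ucb1(n_rounds=2000):
    counts = [0] * N_ARMS
    values = [0.0] * N_ARMS
    for t in range(1, n_rounds + 1):
        if t <= N_ARMS:
            arm = t - 1  # play each arm once to initialize
        else:
            # Empirical mean plus an exploration bonus that shrinks
            # as an arm accumulates pulls.
            arm = max(
                range(N_ARMS),
                key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]),
            )
        r = pull(arm)
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]
    return values

def thompson(n_rounds=2000):
    # Beta(1, 1) uniform prior per arm; Bernoulli rewards keep the
    # posterior Beta, so the update is just two counters.
    alpha = [1] * N_ARMS
    beta = [1] * N_ARMS
    for _ in range(n_rounds):
        samples = [random.betavariate(alpha[a], beta[a]) for a in range(N_ARMS)]
        arm = max(range(N_ARMS), key=samples.__getitem__)  # sample-then-greedy
        r = pull(arm)
        alpha[arm] += r
        beta[arm] += 1 - r
    return alpha, beta
```

With enough rounds, all three concentrate play on the best arm; they differ in how exploration is scheduled (fixed rate, confidence bonus, or posterior sampling).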
Related blog posts:
- Bandits for Recommendation Systems
- Recommendations with Thompson Sampling
- Personalization with Contextual Bandits
- Bayesian Bandits - optimizing click throughs with statistics
- Multi-Armed Bandits
- Bayesian Bandits
- Python Multi-armed Bandits (and Beer!)