Multi-Armed-Bandits

Simple implementations of several algorithms for the multi-armed bandit problem, along with plots comparing their performance.
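The repository's own notebooks are not reproduced here, so as an illustration, below is a minimal sketch of one classic bandit algorithm, epsilon-greedy. The class name, the incremental-mean update, and the Bernoulli reward simulation are all assumptions for this example, not the repository's actual code.

```python
import random

class EpsilonGreedyBandit:
    """Epsilon-greedy agent for a k-armed bandit (illustrative sketch)."""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.epsilon = epsilon
        self.counts = [0] * n_arms        # pulls per arm
        self.values = [0.0] * n_arms      # running mean reward per arm
        self.rng = random.Random(seed)

    def select_arm(self):
        # With probability epsilon, explore a random arm;
        # otherwise exploit the arm with the highest estimated value.
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))
        return max(range(len(self.values)), key=self.values.__getitem__)

    def update(self, arm, reward):
        # Incremental mean update: Q <- Q + (r - Q) / n
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Simulate Bernoulli-reward arms with hidden success probabilities.
probs = [0.2, 0.5, 0.8]
agent = EpsilonGreedyBandit(n_arms=3, epsilon=0.1, seed=42)
for _ in range(5000):
    arm = agent.select_arm()
    reward = 1.0 if agent.rng.random() < probs[arm] else 0.0
    agent.update(arm, reward)
```

After enough pulls, the agent should concentrate most of its plays on the best arm (here, arm 2), which is the kind of behavior the repository's comparison plots would visualize (e.g., as cumulative reward or regret curves).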
