skaraoglu/Multi-armed-bandits
Multi-armed bandit experiments with epsilon-greedy, UCB, Thompson sampling, Bayesian-greedy, and HA-UCB strategies
Jupyter Notebook
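As a minimal illustration of the kind of strategy compared in this repository, here is a sketch of an epsilon-greedy agent on a Bernoulli bandit. This is a generic textbook implementation, not code from the notebooks themselves; the class and parameter names are hypothetical.

```python
import random

class EpsilonGreedyBandit:
    """Epsilon-greedy agent for a k-armed Bernoulli bandit (illustrative sketch)."""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # number of pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm
        self.rng = random.Random(seed)

    def select_arm(self):
        # Explore with probability epsilon, otherwise exploit the best estimate.
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Incremental mean update: v_n = v_{n-1} + (r - v_{n-1}) / n
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Toy run: arm 1 pays off more often, so the agent should learn to favour it.
probs = [0.2, 0.8]
agent = EpsilonGreedyBandit(n_arms=2, epsilon=0.1, seed=42)
env = random.Random(7)
for _ in range(2000):
    arm = agent.select_arm()
    reward = 1.0 if env.random() < probs[arm] else 0.0
    agent.update(arm, reward)
print(agent.counts)
```

The other strategies listed (UCB, Thompson sampling, and variants) differ only in how `select_arm` trades off exploration and exploitation, e.g. UCB adds a confidence bonus to each arm's mean instead of exploring at random.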