niravnb/Multi-armed-bandit-algortihms
Python (Jupyter Notebook) implementations of well-known multi-armed bandit algorithms: Explore-then-Commit, UCB, and Thompson Sampling.
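The repository itself ships the implementations as notebooks. As a rough, self-contained illustration of the three strategies named above, here is a minimal Python sketch on a Bernoulli bandit; the arm means, horizon, and the ETC exploration budget `m` are hypothetical choices for the example, not values taken from the notebooks.

```python
import numpy as np

rng = np.random.default_rng(0)

def pull(p):
    """Simulate one pull of a Bernoulli arm with success probability p."""
    return rng.random() < p

def explore_then_commit(probs, horizon, m=20):
    """Play each arm m times, then commit to the empirically best arm."""
    k = len(probs)
    counts = np.zeros(k)
    sums = np.zeros(k)
    total = 0
    for t in range(horizon):
        if t < k * m:
            arm = t % k                              # round-robin exploration phase
        else:
            arm = int(np.argmax(sums / counts))      # commit to best empirical mean
        r = pull(probs[arm])
        counts[arm] += 1
        sums[arm] += r
        total += r
    return total

def ucb1(probs, horizon):
    """UCB1: pick the arm maximising empirical mean plus an exploration bonus."""
    k = len(probs)
    counts = np.zeros(k)
    sums = np.zeros(k)
    total = 0
    for t in range(horizon):
        if t < k:
            arm = t                                  # pull each arm once to initialise
        else:
            ucb = sums / counts + np.sqrt(2 * np.log(t + 1) / counts)
            arm = int(np.argmax(ucb))
        r = pull(probs[arm])
        counts[arm] += 1
        sums[arm] += r
        total += r
    return total

def thompson_sampling(probs, horizon):
    """Thompson sampling with Beta(1, 1) priors on each Bernoulli arm."""
    k = len(probs)
    alpha = np.ones(k)                               # 1 + observed successes
    beta = np.ones(k)                                # 1 + observed failures
    total = 0
    for _ in range(horizon):
        arm = int(np.argmax(rng.beta(alpha, beta)))  # sample each posterior, play the best draw
        r = pull(probs[arm])
        alpha[arm] += r
        beta[arm] += 1 - r
        total += r
    return total

if __name__ == "__main__":
    arms = [0.3, 0.5, 0.7]                           # hypothetical Bernoulli arm means
    T = 10_000
    print("ETC      :", explore_then_commit(arms, T))
    print("UCB1     :", ucb1(arms, T))
    print("Thompson :", thompson_sampling(arms, T))
```

Each function returns the total reward collected over the horizon, so their outputs can be compared directly against pulling the best arm (mean 0.7) for all `T` rounds.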