# Multi-armed-bandit-algortihms

Python implementations of well-known bandit algorithms: Explore-Then-Commit, UCB, and Thompson Sampling.
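As a flavor of what these algorithms look like, here is a minimal sketch of UCB1 on a Bernoulli bandit. It is an illustrative example, not the repository's code; the function name `ucb1` and the arm means are made up for the demo.

```python
import numpy as np

def ucb1(true_means, horizon, seed=0):
    """Minimal UCB1 sketch on a Bernoulli bandit (illustrative, not the repo's code)."""
    rng = np.random.default_rng(seed)
    k = len(true_means)
    counts = np.zeros(k)   # number of pulls per arm
    rewards = np.zeros(k)  # cumulative reward per arm
    for t in range(horizon):
        if t < k:
            arm = t  # pull each arm once to initialize estimates
        else:
            # UCB index: empirical mean plus exploration bonus
            bonus = np.sqrt(2.0 * np.log(t + 1) / counts)
            arm = int(np.argmax(rewards / counts + bonus))
        counts[arm] += 1
        rewards[arm] += rng.binomial(1, true_means[arm])
    return counts

pulls = ucb1([0.2, 0.5, 0.8], horizon=2000)
```

With a clear gap between arms, the count vector returned after 2000 rounds concentrates on the best arm, which is the behavior UCB-style exploration bonuses are designed to produce.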

Primary language: Jupyter Notebook