Thompson-Greedy-Comparison-for-MultiArmed-Bandits

Repository containing a comparison of two methods, Thompson Sampling and a greedy strategy, for dealing with the exploration-exploitation dilemma in multi-armed bandits.
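As a rough illustration of the comparison (not the repository's code), here is a minimal sketch of Beta-Bernoulli Thompson Sampling and an epsilon-greedy baseline on a Bernoulli bandit; the arm probabilities, step count, and epsilon value are illustrative assumptions.

```python
# Minimal sketch (not the repository's code): Thompson Sampling vs. epsilon-greedy
# on a Bernoulli multi-armed bandit. true_probs, n_steps, and epsilon are
# illustrative assumptions, not values taken from the repository.

import numpy as np

rng = np.random.default_rng(0)
true_probs = np.array([0.2, 0.5, 0.75])   # hypothetical arm reward probabilities
n_steps = 10_000
epsilon = 0.1

def run_thompson(true_probs, n_steps):
    """Beta-Bernoulli Thompson Sampling: sample from each arm's posterior, pull the argmax."""
    k = len(true_probs)
    successes = np.ones(k)   # Beta(1, 1) uniform priors
    failures = np.ones(k)
    total_reward = 0
    for _ in range(n_steps):
        samples = rng.beta(successes, failures)      # one posterior draw per arm
        arm = int(np.argmax(samples))
        reward = rng.random() < true_probs[arm]      # Bernoulli reward
        successes[arm] += reward
        failures[arm] += 1 - reward
        total_reward += reward
    return total_reward

def run_epsilon_greedy(true_probs, n_steps, epsilon):
    """Epsilon-greedy: explore a random arm with probability epsilon, else exploit the best estimate."""
    k = len(true_probs)
    counts = np.zeros(k)
    values = np.zeros(k)     # running mean reward per arm
    total_reward = 0
    for _ in range(n_steps):
        if rng.random() < epsilon:
            arm = int(rng.integers(k))               # explore
        else:
            arm = int(np.argmax(values))             # exploit
        reward = rng.random() < true_probs[arm]
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
        total_reward += reward
    return total_reward

print("Thompson Sampling reward:", run_thompson(true_probs, n_steps))
print("Epsilon-greedy reward:   ", run_epsilon_greedy(true_probs, n_steps, epsilon))
```

Thompson Sampling explores implicitly through posterior uncertainty, while epsilon-greedy explores at a fixed rate regardless of how much is already known about each arm; comparing cumulative reward (or regret) across runs is the usual way to contrast the two.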

Primary language: Python
License: MIT
