bandits

A comparison of bandit algorithms from the Reinforcement Learning bible (Sutton & Barto, Reinforcement Learning: An Introduction).

Primary language: Python. License: MIT.

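The sketch below shows the kind of experiment such a comparison runs: an epsilon-greedy agent with sample-average value estimates on a 10-armed testbed (Sutton & Barto, chapter 2). All names and parameter values here are illustrative assumptions, not taken from this repository's code.

```python
# Minimal epsilon-greedy bandit experiment on a 10-armed testbed.
# Parameters (k, steps, runs, epsilon values) are illustrative only.
import numpy as np

def run_bandit(epsilon, k=10, steps=1000, runs=200, seed=0):
    """Average reward per step of an epsilon-greedy agent over many random k-armed bandits."""
    rng = np.random.default_rng(seed)
    avg_reward = np.zeros(steps)
    for _ in range(runs):
        q_true = rng.normal(0.0, 1.0, k)   # true action values, redrawn each run
        q_est = np.zeros(k)                # sample-average estimates
        counts = np.zeros(k)               # action selection counts
        for t in range(steps):
            if rng.random() < epsilon:
                a = int(rng.integers(k))   # explore: random arm
            else:
                a = int(np.argmax(q_est))  # exploit: current best estimate
            r = rng.normal(q_true[a], 1.0) # noisy reward around the true value
            counts[a] += 1
            q_est[a] += (r - q_est[a]) / counts[a]  # incremental mean update
            avg_reward[t] += r
    return avg_reward / runs

if __name__ == "__main__":
    # Compare greedy vs. lightly and moderately exploratory agents.
    for eps in (0.0, 0.01, 0.1):
        rewards = run_bandit(eps)
        print(f"epsilon={eps:<5} mean reward over last 100 steps: {rewards[-100:].mean():.3f}")
```

With these settings, the epsilon = 0.1 agent typically overtakes the purely greedy one after a few hundred steps, which is the classic exploration-exploitation trade-off the testbed illustrates.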