MultiArmedBandits

Python implementation of multi-armed bandits, with agent classes and arms for rapid experimentation. Mostly for fun!
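A minimal sketch of the agent/arm split described above, using an epsilon-greedy agent against Bernoulli arms. Class and method names (`BernoulliArm`, `EpsilonGreedyAgent`, `pull`, `select_arm`, `update`) are assumptions for illustration, not this repo's actual API.

```python
import random

# Hypothetical sketch: names are assumptions, not this repo's actual API.
class BernoulliArm:
    """An arm that pays out 1 with a fixed probability, else 0."""
    def __init__(self, p):
        self.p = p

    def pull(self):
        return 1 if random.random() < self.p else 0


class EpsilonGreedyAgent:
    """Pulls a random arm with probability epsilon, else the best-known arm."""
    def __init__(self, n_arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_arms
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))  # explore
        return max(range(len(self.counts)), key=lambda i: self.values[i])

    def update(self, arm, reward):
        # Incremental update of the running mean for the pulled arm.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


if __name__ == "__main__":
    random.seed(0)
    arms = [BernoulliArm(p) for p in (0.2, 0.5, 0.8)]
    agent = EpsilonGreedyAgent(n_arms=len(arms), epsilon=0.1)
    for _ in range(2000):
        arm = agent.select_arm()
        agent.update(arm, arms[arm].pull())
    print("estimated values:", [round(v, 2) for v in agent.values])
```

Separating arms (reward distributions) from agents (selection policies) like this makes it easy to swap in other policies, e.g. UCB or Thompson sampling, without touching the environment code.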

Primary language: HTML. License: GNU General Public License v3.0 (GPL-3.0).
