trustful-bandits

A two-armed bandit simulation and a comparison with its theoretical convergence
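The source does not specify which bandit policy the notebook implements, so as a hedged illustration, here is a minimal epsilon-greedy simulation of a two-armed Bernoulli bandit that tracks cumulative regret, the quantity one would compare against a theoretical convergence rate. The arm probabilities, epsilon, and horizon are arbitrary assumptions, not taken from the repository.

```python
import random

def simulate_bandit(p_arms=(0.4, 0.6), epsilon=0.1, horizon=10_000, seed=0):
    """Run epsilon-greedy on a two-armed Bernoulli bandit.

    Returns the cumulative regret relative to always pulling the best arm.
    All parameter values here are illustrative assumptions.
    """
    rng = random.Random(seed)
    counts = [0, 0]       # number of pulls per arm
    values = [0.0, 0.0]   # empirical mean reward per arm
    best = max(p_arms)
    regret = 0.0
    for _ in range(horizon):
        if rng.random() < epsilon or 0 in counts:
            arm = rng.randrange(2)                        # explore
        else:
            arm = 0 if values[0] >= values[1] else 1      # exploit
        reward = 1.0 if rng.random() < p_arms[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
        regret += best - p_arms[arm]                      # expected per-step regret
    return regret

print(simulate_bandit())
```

With a fixed epsilon, regret grows linearly in the horizon (roughly `epsilon/2 * gap` per step); a decaying epsilon or a UCB-style policy would achieve the logarithmic regret that theory predicts, which is the kind of curve such a notebook would typically plot.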

Primary language: Jupyter Notebook
