/bandit

A library implementing several semi-uniform strategies to tackle the multi-armed bandit problem.

Primary Language: MATLAB
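
Semi-uniform strategies mix a uniform-random "explore" step with a greedy "exploit" step; epsilon-greedy is the canonical example. The repository's own API is not shown on this page, so the following is only a minimal sketch of the kind of strategy it implements, written in MATLAB: all names, the arm probabilities, and the parameter values are hypothetical, not this library's interface.

```matlab
% Minimal epsilon-greedy sketch (hypothetical; not this library's API).
% With probability epsilon, pull a uniformly random arm (explore);
% otherwise pull the arm with the highest estimated mean (exploit).
function epsilon_greedy_demo()
    rng(0);                % reproducible run
    p = [0.2 0.5 0.7];     % true Bernoulli reward probabilities (assumed)
    K = numel(p);          % number of arms
    T = 10000;             % horizon
    epsilon = 0.1;         % exploration rate

    counts = zeros(1, K);  % pulls per arm
    values = zeros(1, K);  % running mean reward per arm
    total = 0;

    for t = 1:T
        if rand() < epsilon
            arm = randi(K);          % explore: uniform random arm
        else
            [~, arm] = max(values);  % exploit: current best estimate
        end
        reward = double(rand() < p(arm));  % Bernoulli reward draw
        counts(arm) = counts(arm) + 1;
        % incremental update of the running mean for the pulled arm
        values(arm) = values(arm) + (reward - values(arm)) / counts(arm);
        total = total + reward;
    end

    fprintf('Average reward: %.3f (best arm mean: %.1f)\n', total / T, max(p));
end
```

Other semi-uniform variants differ only in how the exploration rate is handled: epsilon-first explores uniformly for a fixed initial phase and then exploits, while epsilon-decreasing shrinks epsilon over time (e.g. proportionally to 1/t) so exploration tapers off as the estimates improve.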
