BanditEmpirical

Empirical tests of various bandit algorithms.


Empirical Evaluation of Bandit Algorithms

Using Yahoo! Webscope TODAY article click data to test a variety of bandit algorithms.
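Testing bandit policies against logged click data is typically done with an offline replay evaluator in the style of Li et al.: an event from the log is used only when the policy under test happens to choose the same article the logging system actually displayed. A minimal sketch of that idea (the `(shown_arm, reward, candidates)` log format and the `select`/`update` policy interface are illustrative assumptions, not necessarily this repo's actual API):

```python
def replay_evaluate(policy, log):
    """Offline replay evaluation of a bandit policy (sketch).

    `log` yields (shown_arm, reward, candidates) tuples from the
    click log. An event is usable only when the policy's choice
    matches the arm the logging system actually showed; the returned
    value is the empirical CTR over those matched events.
    """
    matched, total_reward = 0, 0.0
    for shown_arm, reward, candidates in log:
        chosen = policy.select(candidates)
        if chosen == shown_arm:          # event usable only on a match
            policy.update(chosen, reward)
            matched += 1
            total_reward += reward
    return total_reward / matched if matched else 0.0
```

Because matches are effectively random with respect to the policy, this gives an unbiased estimate of the policy's click-through rate, at the cost of discarding most log events.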

Algorithms currently slated for testing:

  • Epsilon-greedy
  • UCB (context-less)
  • UCB (indexed)
  • GLM-UCB
  • Thompson Sampling
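As a concrete example of the first algorithm in the list, an epsilon-greedy policy might be sketched as below. The class and method names are illustrative assumptions, not the repo's actual implementation:

```python
import random

class EpsilonGreedy:
    """Epsilon-greedy bandit: explore a uniformly random arm with
    probability epsilon, otherwise exploit the arm with the highest
    empirical mean reward so far."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {}   # arm -> number of pulls
        self.values = {}   # arm -> running mean reward

    def select(self, arms):
        for arm in arms:   # track any arms not seen before
            self.counts.setdefault(arm, 0)
            self.values.setdefault(arm, 0.0)
        if random.random() < self.epsilon:
            return random.choice(arms)     # explore
        return max(arms, key=lambda a: self.values[a])  # exploit

    def update(self, arm, reward):
        self.counts[arm] += 1
        n = self.counts[arm]
        # incremental running-mean update
        self.values[arm] += (reward - self.values[arm]) / n
```

The context-free UCB variants differ only in `select`, adding an exploration bonus to each arm's mean instead of flipping an epsilon-weighted coin; GLM-UCB and Thompson Sampling additionally condition on article/user context.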