multi-armed
There are 2 repositories under the multi-armed topic.
Nth-iteration-labs/contextual
Contextual Bandits in R - simulation and evaluation of Multi-Armed Bandit Policies
Nth-iteration-labs/streamingbandit
Python application to set up and run streaming (contextual) bandit experiments.
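
For readers new to the topic, the sketch below illustrates the basic idea behind a multi-armed bandit policy: an epsilon-greedy agent balancing exploration and exploitation on Bernoulli arms. It is a self-contained, library-agnostic illustration of the kind of policy these repositories simulate and evaluate, not code from either project; all names are hypothetical.

```python
# Minimal epsilon-greedy policy on a Bernoulli multi-armed bandit.
# Illustrative only; not taken from contextual or streamingbandit.
import random

def run_epsilon_greedy(true_means, n_rounds=10_000, epsilon=0.1, seed=0):
    """Simulate epsilon-greedy on arms with Bernoulli reward probabilities."""
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms      # pulls per arm
    values = [0.0] * n_arms    # estimated mean reward per arm
    total_reward = 0.0

    for _ in range(n_rounds):
        if rng.random() < epsilon:          # explore: pick a random arm
            arm = rng.randrange(n_arms)
        else:                               # exploit: pick the current best estimate
            arm = max(range(n_arms), key=lambda a: values[a])

        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # Incremental update of the running mean for the chosen arm
        values[arm] += (reward - values[arm]) / counts[arm]
        total_reward += reward

    return total_reward, counts

if __name__ == "__main__":
    reward, pulls = run_epsilon_greedy([0.05, 0.10, 0.20])
    print(f"total reward: {reward:.0f}, pulls per arm: {pulls}")
```

Running the script shows the agent concentrating most of its pulls on the arm with the highest true mean while still occasionally sampling the others, which is the exploration/exploitation trade-off at the heart of bandit experiments.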