multi-arm-bandits
There are 8 repositories under the multi-arm-bandits topic.
SMPyBandits/SMPyBandits
🔬 Research framework for single- and multi-player 🎰 Multi-Armed Bandit (MAB) algorithms, implementing all the state-of-the-art algorithms for the single-player (UCB, KL-UCB, Thompson sampling, ...) and multi-player (MusicalChair, MEGA, rhoRand, MCTop/RandTopM, etc.) settings. Available on PyPI: https://pypi.org/project/SMPyBandits/ with documentation on
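The single-player algorithms named above (UCB and its variants) all follow the same pattern: play the arm whose empirical mean plus an exploration bonus is largest. A minimal UCB1 sketch, independent of SMPyBandits' actual API (the function and variable names here are illustrative, not the library's):

```python
import math
import random

random.seed(1)

def ucb1(counts, rewards, t):
    """UCB1 index policy: empirical mean plus bonus sqrt(2 ln t / n).

    counts[i] is the number of pulls of arm i, rewards[i] its summed reward.
    Each arm is played once before the index is used (standard convention).
    """
    for i, n in enumerate(counts):
        if n == 0:
            return i
    return max(range(len(counts)),
               key=lambda i: rewards[i] / counts[i]
                             + math.sqrt(2 * math.log(t) / counts[i]))

# Simulate two Bernoulli arms; UCB1 should concentrate pulls on the better one.
means = [0.2, 0.8]
counts = [0, 0]
rewards = [0.0, 0.0]
for t in range(1, 2001):
    arm = ucb1(counts, rewards, t)
    reward = 1.0 if random.random() < means[arm] else 0.0
    counts[arm] += 1
    rewards[arm] += reward
```

SMPyBandits implements this family (UCB, KL-UCB, ...) with many refinements; the sketch only shows the shared index-policy idea.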
v-i-s-h/MAB.jl
A Julia package for running multi-armed bandit experiments
crenwick/Swiper
🦊 A series of bandit algorithms in Swift
rudra2112/Warfarin-Dosage-Prediction
Warfarin dosage prediction using a Linear UCB (LinUCB) multi-armed bandit
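LinUCB is the contextual variant of UCB: each arm keeps a ridge-regression estimate of reward given the patient's feature vector, plus a confidence bonus. A generic sketch of the disjoint LinUCB algorithm; the class name, feature layout, and dosage arms below are assumptions for illustration, not taken from the repository:

```python
import numpy as np

class LinUCB:
    """Disjoint LinUCB: one ridge-regression model per arm.

    Generic sketch (hypothetical interface); the repository's actual
    features and dosage buckets may differ.
    """
    def __init__(self, n_arms, n_features, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(n_features) for _ in range(n_arms)]    # X^T X + I
        self.b = [np.zeros(n_features) for _ in range(n_arms)]  # X^T r

    def select(self, x):
        """Return the arm maximising theta·x + alpha * sqrt(x^T A^-1 x)."""
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        """Rank-one update of the chosen arm's regression statistics."""
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x
```

For warfarin dosing, the arms would typically be discrete dose buckets and `x` the patient's clinical/genetic features, with reward 1 if the chosen bucket matches the correct dose.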
GuilongAaron/beta_distribution_adprediction
This program applies the Thompson sampling bandit algorithm to ad selection, choosing the ad with the highest estimated probability of being clicked.
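For Bernoulli rewards like clicks, Thompson sampling keeps a Beta posterior per ad and plays the ad whose posterior draw is largest. A minimal sketch, assuming hypothetical click/no-click counters (the repository's actual data layout may differ):

```python
import random

random.seed(0)

def thompson_sample(successes, failures):
    """Pick the ad whose Beta(successes+1, failures+1) draw is largest.

    successes[i] / failures[i] are observed clicks / non-clicks for ad i
    (hypothetical counters for illustration).
    """
    draws = [random.betavariate(s + 1, f + 1)
             for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=lambda i: draws[i])

# Simulate: ad 1 has the higher true click rate, so it should win most rounds.
true_ctr = [0.05, 0.15]
succ, fail = [0, 0], [0, 0]
for _ in range(5000):
    arm = thompson_sample(succ, fail)
    if random.random() < true_ctr[arm]:
        succ[arm] += 1
    else:
        fail[arm] += 1
```

The Beta(s+1, f+1) posterior assumes a uniform Beta(1, 1) prior on each ad's click rate; sampling from it naturally balances exploring uncertain ads against exploiting the best-looking one.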
SMPyBandits/SMPyBandits.github.io
Write-only repository that hosts the documentation for "Open-Source Python package for Single- and Multi-Players multi-armed Bandits algorithms" (SMPyBandits).