raffg / multi_armed_bandit

Monte Carlo simulations of several different multi-armed bandit algorithms and a comparison with classical statistical A/B testing
76 · Updated 5 years ago
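
For context, the kind of Monte Carlo bandit simulation described above can be sketched in a few lines. The snippet below is a minimal, illustrative epsilon-greedy simulation; the function name, arm probabilities, and choice of policy are assumptions for illustration, not taken from the repository, which may compare different algorithms.

```python
import random

def simulate_epsilon_greedy(true_rates, epsilon=0.1, n_trials=10_000, seed=0):
    """Run one Monte Carlo simulation of an epsilon-greedy bandit.

    true_rates: true conversion probability of each arm.
    Returns total reward and per-arm pull counts.
    """
    rng = random.Random(seed)
    n_arms = len(true_rates)
    pulls = [0] * n_arms
    rewards = [0] * n_arms

    for _ in range(n_trials):
        if rng.random() < epsilon:
            # explore: pick a random arm
            arm = rng.randrange(n_arms)
        else:
            # exploit: pick the arm with the highest observed conversion rate
            arm = max(range(n_arms),
                      key=lambda a: rewards[a] / pulls[a] if pulls[a] else 0.0)
        reward = 1 if rng.random() < true_rates[arm] else 0
        pulls[arm] += 1
        rewards[arm] += reward

    return sum(rewards), pulls

total, pulls = simulate_epsilon_greedy([0.02, 0.03, 0.05])
print(f"total conversions: {total}, pulls per arm: {pulls}")
```

Running many such simulations and comparing the cumulative reward against a fixed 50/50 split is the basic way a bandit policy is benchmarked against classical A/B testing.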

Alternatives and similar repositories for multi_armed_bandit

Users interested in multi_armed_bandit are comparing it to the libraries listed below
