cfoh / Multi-Armed-Bandit-Example

Learning Multi-Armed Bandits by examples. Currently covers MAB, UCB, Boltzmann Exploration, Thompson Sampling, Contextual MAB, and Deep MAB.
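The repository's own code is not shown here, so as a minimal illustration of one of the listed algorithms (UCB1, the classic upper-confidence-bound strategy), here is a self-contained sketch on simulated Bernoulli arms; the arm probabilities and loop length are illustrative assumptions, not values from the repository:

```python
import math
import random

def ucb1(counts, values, t):
    """Pick the arm with the highest UCB1 score.

    counts[i]: times arm i was pulled; values[i]: its running mean reward;
    t: total pulls so far. Any unpulled arm is tried first.
    """
    for i, n in enumerate(counts):
        if n == 0:
            return i
    scores = [values[i] + math.sqrt(2 * math.log(t) / counts[i])
              for i in range(len(counts))]
    return scores.index(max(scores))

# Simulate three Bernoulli arms with hidden success probabilities
# (hypothetical values for this demo; arm 2 is best).
random.seed(0)
probs = [0.2, 0.5, 0.8]
counts = [0] * 3
values = [0.0] * 3
for t in range(1, 2001):
    arm = ucb1(counts, values, t)
    reward = 1.0 if random.random() < probs[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean

print(counts)  # pull counts should concentrate on the best arm
```

UCB1 balances exploration and exploitation by adding a confidence bonus that shrinks as an arm is pulled more often, so after enough rounds most pulls go to the highest-reward arm.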
25 stars · Updated last year
