ondrejbiza / bandits

Comparison of bandit algorithms from the Reinforcement Learning bible (Sutton & Barto's *Reinforcement Learning: An Introduction*).
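The simplest of the bandit algorithms covered in that book's chapter on multi-armed bandits is epsilon-greedy with incremental sample-average value estimates. A minimal sketch, assuming hypothetical Bernoulli reward arms (the specific arm means and parameter names here are illustrative, not taken from the repository):

```python
import random

def epsilon_greedy(true_means, steps=2000, epsilon=0.1, seed=0):
    """Epsilon-greedy on a k-armed Bernoulli bandit.

    With probability epsilon a random arm is explored; otherwise the arm
    with the highest estimated value is exploited. Value estimates use
    the incremental sample-average update Q[a] += (r - Q[a]) / N[a].
    Returns the value estimates Q and per-arm pull counts N.
    """
    rng = random.Random(seed)
    k = len(true_means)
    Q = [0.0] * k  # estimated value of each arm
    N = [0] * k    # number of times each arm was pulled
    for _ in range(steps):
        if rng.random() < epsilon:
            a = rng.randrange(k)                    # explore
        else:
            a = max(range(k), key=lambda i: Q[i])   # exploit
        r = 1.0 if rng.random() < true_means[a] else 0.0  # Bernoulli reward
        N[a] += 1
        Q[a] += (r - Q[a]) / N[a]  # incremental sample-average update
    return Q, N
```

With well-separated arm means and enough steps, the pull counts concentrate on the best arm while exploration keeps occasionally sampling the others.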
