johnmyleswhite / BanditsBook
Code for my book on Multi-Armed Bandit Algorithms
☆920 · updated Jan 9, 2020 (6 years ago)
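The libraries listed below all implement variants of the algorithms covered in the book. As a rough illustration of what "multi-armed bandit" means here, the following is a minimal epsilon-greedy sketch (a generic textbook formulation, not code taken from BanditsBook or any of the listed repositories):

```python
import random

def epsilon_greedy(values, epsilon):
    """Pick an arm index: explore uniformly with probability epsilon,
    otherwise exploit the arm with the highest estimated reward."""
    if random.random() < epsilon:
        return random.randrange(len(values))
    return max(range(len(values)), key=lambda i: values[i])

def update(counts, values, arm, reward):
    """Incrementally update the running mean reward of the chosen arm."""
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]
```

With two Bernoulli arms (say, payout probabilities 0.2 and 0.8), repeatedly selecting via `epsilon_greedy` and calling `update` concentrates pulls on the better arm while still sampling the other occasionally.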
Alternatives and similar repositories for BanditsBook
Users interested in BanditsBook also compare it to the libraries listed below:
- Python library for Multi-Armed Bandits · ☆766 · updated Feb 11, 2020
- Library of contextual bandits algorithms · ☆339 · updated Mar 14, 2024
- Python implementations of contextual bandits algorithms · ☆820 · updated Jan 14, 2026
- A JavaScript demo of some multi-armed bandit algorithms · ☆39 · updated Aug 23, 2020
- R package for split test/one-armed bandit analysis · ☆16 · updated May 5, 2014
- Multi-armed bandit simulation library · ☆141 · updated Nov 9, 2023
- 🔬 Research Framework for Single and Multi-Players 🎰 Multi-Arms Bandits (MAB) Algorithms, implementing all the state-of-the-art algorithms… · ☆418 · updated Apr 30, 2024
- Contextual bandit in Python · ☆112 · updated Jul 7, 2021
- An implementation of the multi-armed bandit optimization pattern as a Flask extension