gdmarmerola / advanced-bandit-problems
More about the exploration-exploitation tradeoff with harder bandits
☆24 · Updated 6 years ago
Alternatives and similar repositories for advanced-bandit-problems
Users interested in advanced-bandit-problems are comparing it to the libraries listed below.
- Big Data's open seminars: An Interactive Introduction to Reinforcement Learning ☆64 · Updated 4 years ago
- Bandit algorithms simulations for online learning ☆88 · Updated 5 years ago
- In this notebook several classes of multi-armed bandits are implemented, including epsilon-greedy, UCB, Linear UCB (Contextual bandit… ☆87 · Updated 4 years ago
- Multi-Armed Bandit Algorithms Library (MAB) ☆133 · Updated 2 years ago
- (ICML 2020) “Counterfactual Cross-Validation: Stable Model Selection Procedure for Causal Inference Models” ☆31 · Updated 2 years ago
- Library of contextual bandit algorithms ☆334 · Updated last year
- Play with the solutions to the multi-armed bandit problem. ☆416 · Updated last year
- Contextual bandit in Python ☆114 · Updated 4 years ago
- Study of NeuralUCB and regret analysis for contextual bandits with neural decisions ☆96 · Updated 3 years ago
- Bandit algorithms ☆30 · Updated 7 years ago
- 🔬 Research framework for single- and multi-player 🎰 multi-armed bandit (MAB) algorithms, implementing all the state-of-the-art algorith… ☆407 · Updated last year
- Thompson Sampling Tutorial ☆54 · Updated 6 years ago
- Multi-armed bandit algorithms applied to the MovieLens 20M dataset ☆56 · Updated 5 years ago
- A lightweight contextual bandit & reinforcement learning library designed to be used in production Python services. ☆66 · Updated 4 years ago
- Semi-synthetic experiments to test several approaches for off-policy evaluation and optimization of slate recommenders. ☆43 · Updated 7 years ago
- ☆30 · Updated 5 years ago
- Contextual Bandit Algorithms (+ Bandit Algorithms) ☆22 · Updated 5 years ago
- A lightweight Python library for bandit algorithms ☆30 · Updated 3 years ago
- Python code for the Linear UCB (LinUCB) bandit learning algorithm of Li et al. (2010) ☆19 · Updated 10 years ago
- [IJAIT 2021] MABWiser: Contextual Multi-Armed Bandits Library ☆251 · Updated 11 months ago
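Most of the repositories above implement variants of the same exploration-exploitation loop (epsilon-greedy, UCB, LinUCB, Thompson sampling). For orientation only, here is a minimal epsilon-greedy sketch in Python; the arm probabilities and `epsilon` value are made up for illustration, and the code is not taken from any of the listed libraries:

```python
import random

# Hypothetical toy example: epsilon-greedy on Bernoulli arms.
# Not code from any of the repositories listed above.

def pull(arm_probs, arm):
    """Simulate one pull of `arm`: reward 1 with probability arm_probs[arm], else 0."""
    return 1 if random.random() < arm_probs[arm] else 0

def epsilon_greedy(arm_probs, steps=10_000, epsilon=0.1):
    """Explore a random arm with probability epsilon; otherwise exploit
    the arm with the highest estimated mean reward so far."""
    n_arms = len(arm_probs)
    counts = [0] * n_arms    # number of pulls per arm
    values = [0.0] * n_arms  # running mean reward per arm
    total_reward = 0
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.randrange(n_arms)                    # explore
        else:
            arm = max(range(n_arms), key=values.__getitem__)  # exploit
        reward = pull(arm_probs, arm)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]   # incremental mean
        total_reward += reward
    return values, total_reward

if __name__ == "__main__":
    estimates, total = epsilon_greedy([0.2, 0.5, 0.7])  # made-up arm probabilities
    print("estimated arm means:", estimates)
    print("total reward:", total)
```

The incremental mean update keeps per-arm state constant in size; UCB-style methods in the libraries above replace the random exploration branch with a confidence bonus, and Thompson sampling replaces it with posterior sampling.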