submit-paper / Danzero_plus
☆42 · Updated last year
Alternatives and similar repositories for Danzero_plus:
Users interested in Danzero_plus are comparing it to the libraries listed below.
- mcc_second_guandan ☆79 · Updated 2 years ago
- [NeurIPS 2022] PerfectDou: Dominating DouDizhu with Perfect Information Distillation ☆180 · Updated 11 months ago
- ☆44 · Updated 2 years ago
- ☆9 · Updated 2 years ago
- A Doudizhu reinforcement learning AI ☆28 · Updated 4 months ago
- Douzero with ResNet and GPU support for Windows ☆41 · Updated 3 years ago
- ☆32 · Updated 7 months ago
- Python Fan calculator for Chinese Standard Mahjong ☆20 · Updated 3 months ago
- ☆40 · Updated last week
- TextStarCraft2, a pure-language environment that lets LLMs play StarCraft II ☆264 · Updated 2 weeks ago
- Mahjong game code built on the RLCard platform, including rule-based and Dueling DQN agent models ☆30 · Updated 3 years ago
- ☆70 · Updated last year
- Learning-based agent for Google Research Football (football game agent) ☆111 · Updated 2 years ago
- A Simple, Distributed and Asynchronous Multi-Agent Reinforcement Learning Framework for Google Research Football AI. ☆100 · Updated last year
- Reinforcement learning and planning for Minecraft. ☆179 · Updated last year
- Build your own Chinese chess (Xiangqi) AI with the AlphaZero algorithm ☆258 · Updated 2 years ago
- An unofficial implementation of AlphaHoldem, a 1v1 no-limit hold'em AI. ☆87 · Updated last year
- ☆21 · Updated 2 years ago
- Enabling Mixed Opponent Strategy Script and Self-play on SMAC ☆28 · Updated last week
- ☆63 · Updated last year
- lecture32: AI takes on StarCraft II (reinforcement learning) ☆17 · Updated 2 years ago
- MahjongZero: DouZero for Mahjong | Mahjong AI ☆28 · Updated 2 years ago
- ☆165 · Updated last year
- ☆46 · Updated 3 months ago
- Doudizhu AI trained with reinforcement learning ☆15 · Updated 5 years ago
- A New Approach to Solving SMAC Task: Generating Decision Tree Code from Large Language Models ☆39 · Updated last month
- Leaderboard and Visualization for RLCard ☆380 · Updated last year
- An easier PyTorch deep reinforcement learning library. ☆207 · Updated 4 months ago
- C++/Python Fight the Landlord (Doudizhu) with pybind11 (reinforcement-learning Doudizhu AI), accepted to AIIDE-2020 ☆160 · Updated 3 years ago
- Yanglegeyang AI ☆23 · Updated 2 years ago