junkwhinger / PPO_PyTorch

This repo contains a PPO (Proximal Policy Optimization) implementation in PyTorch for the LunarLander-v2 Gym environment.
10 stars · Updated 4 years ago
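As a rough illustration of the algorithm the repo implements, here is a minimal sketch of PPO's clipped surrogate objective in PyTorch. This is a generic hypothetical example of the standard PPO loss (Schulman et al., 2017), not code taken from this repository; the function name and shapes are assumptions.

```python
import torch

def ppo_clip_loss(log_probs, old_log_probs, advantages, clip_eps=0.2):
    """Clipped PPO policy loss (hypothetical helper, not from this repo)."""
    # Probability ratio pi_new(a|s) / pi_old(a|s), computed in log space
    ratio = torch.exp(log_probs - old_log_probs)
    unclipped = ratio * advantages
    # Clip the ratio to [1 - eps, 1 + eps] to limit the policy update size
    clipped = torch.clamp(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    # Negate because optimizers minimize, while PPO maximizes the surrogate
    return -torch.min(unclipped, clipped).mean()

# With identical old/new policies the ratio is 1, so the loss is
# simply -mean(advantages).
lp = torch.log(torch.tensor([0.5, 0.25]))
adv = torch.tensor([1.0, -1.0])
loss = ppo_clip_loss(lp, lp, adv)
```

In a full training loop this loss would be combined with a value-function loss and an entropy bonus, as in the original paper.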
