qingshi9974 / PPO-pytorch-Mujoco

Implements the PPO algorithm on MuJoCo environments such as Ant-v2, Humanoid-v2, Hopper-v2, and HalfCheetah-v2.
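For orientation, PPO's core is the clipped surrogate objective (Schulman et al., 2017). The following is a minimal PyTorch sketch of that loss, not the repository's actual code; the function name, argument names, and the default `clip_eps` value are illustrative assumptions.

```python
import torch

def ppo_clip_loss(new_log_probs, old_log_probs, advantages, clip_eps=0.2):
    """Clipped surrogate PPO loss (illustrative sketch, not this repo's API).

    new_log_probs / old_log_probs: log pi(a|s) under the current and the
    behavior policy for the sampled actions; advantages: estimated advantages.
    """
    # Probability ratio pi_new(a|s) / pi_old(a|s), computed in log space.
    ratio = torch.exp(new_log_probs - old_log_probs)
    unclipped = ratio * advantages
    # Clip the ratio to [1 - eps, 1 + eps] to limit the policy update step.
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # PPO maximizes the minimum of the two terms; return the negative mean
    # so the result can be minimized with a standard optimizer.
    return -torch.min(unclipped, clipped).mean()
```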

Alternatives and similar repositories for PPO-pytorch-Mujoco:

Users interested in PPO-pytorch-Mujoco are comparing it to the libraries listed below.