PPO-pytorch-Mujoco

Implement the PPO algorithm on MuJoCo environments such as Ant-v2, Humanoid-v2, Hopper-v2, and HalfCheetah-v2.
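
At the core of PPO is the clipped surrogate objective. Below is a minimal PyTorch sketch of that loss for orientation; it is illustrative only, and the names (ppo_clip_loss, clip_eps, etc.) are assumptions rather than identifiers from this repository's code.

import torch

def ppo_clip_loss(log_probs_new, log_probs_old, advantages, clip_eps=0.2):
    # Probability ratio between the current policy and the old (behaviour) policy
    ratio = torch.exp(log_probs_new - log_probs_old)
    # Unclipped and clipped surrogate terms
    surr1 = ratio * advantages
    surr2 = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # PPO maximises the minimum of the two terms; negate to obtain a loss
    return -torch.min(surr1, surr2).mean()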

Requirements

  • python 3.7.6
  • gym 0.17.6
  • mujoco_py 2.0.2.10
  • PyTorch (a quick installation check is sketched after this list)
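
To confirm the stack is set up correctly, a quick check along these lines can help. This snippet is an assumption added for illustration and is not part of the repository:

import gym
import mujoco_py  # importing verifies that the MuJoCo bindings load
import torch

env = gym.make("Hopper-v2")
print("gym", gym.__version__, "| torch", torch.__version__)
print("observation space:", env.observation_space, "| action space:", env.action_space)
env.close()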

Usage

$ python main.py --env_name Hopper-v2
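
The entry point is expected to build the MuJoCo environment named by --env_name and train PPO on it. A hypothetical minimal version of such a script is sketched below; the actual main.py in this repository may be structured differently, and the random-action loop here merely stands in for the trained PPO policy.

import argparse
import gym

def run(env_name):
    env = gym.make(env_name)  # e.g. Hopper-v2
    obs = env.reset()
    done = False
    while not done:
        action = env.action_space.sample()  # placeholder for the PPO policy
        obs, reward, done, info = env.step(action)
    env.close()

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--env_name", type=str, default="Hopper-v2")
    args = parser.parse_args()
    run(args.env_name)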

Results

Hopper-v2

[result plot]

Humanoid-v2

[result plot]

HalfCheetah-v2

[result plot]

Ant-v2

[result plot]