Implement the PPO algorithm on MuJoCo environments, such as Ant-v2, Humanoid-v2, Hopper-v2, and HalfCheetah-v2.
- python 3.7.6
- gym 0.17.6
- mujoco_py 2.0.2.10
- pytorch
$ python main.py --env_name Hopper-v2
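The core of PPO is the clipped surrogate objective. Below is a minimal sketch of that loss in PyTorch (the framework listed above); the function name and tensor shapes are illustrative assumptions, not this repository's actual code.

```python
import torch

def ppo_clip_loss(new_log_probs, old_log_probs, advantages, clip_eps=0.2):
    """Clipped surrogate policy loss from the PPO paper (hypothetical helper).

    new_log_probs / old_log_probs: log pi(a|s) under the current and the
    behavior policy for a batch of sampled actions, shape (batch,).
    advantages: advantage estimates for the same batch, shape (batch,).
    """
    # Probability ratio r(theta) = pi_new(a|s) / pi_old(a|s)
    ratio = torch.exp(new_log_probs - old_log_probs)
    unclipped = ratio * advantages
    # Clip the ratio to [1 - eps, 1 + eps] to limit the policy update size
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # Maximize the pessimistic (min) objective -> minimize its negation
    return -torch.min(unclipped, clipped).mean()

if __name__ == "__main__":
    lp = torch.zeros(4)                           # identical old/new policies
    adv = torch.tensor([1.0, -1.0, 2.0, 0.5])
    print(ppo_clip_loss(lp, lp, adv).item())      # ratio == 1, so loss == -mean(adv)
```

With identical old and new log-probabilities the ratio is 1 everywhere, so the loss reduces to the negated mean advantage, which is a convenient sanity check.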