# ppo-mujoco

A minimal codebase for PPO training on MuJoCo environments, with support for some customization.
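For reference, a minimal sketch of PPO's clipped surrogate objective, the core of the algorithm this codebase implements. This is an illustrative NumPy version, not taken from this repository's code; the function name and `eps` default are assumptions.

```python
import numpy as np

def ppo_clip_loss(ratio, advantage, eps=0.2):
    """Negative PPO clipped surrogate objective (a loss to minimize).

    ratio: new_policy_prob / old_policy_prob per action (illustrative).
    advantage: advantage estimates per action.
    eps: clipping range (0.2 is a common default, assumed here).
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    # Take the pessimistic (smaller) objective, then negate for gradient descent.
    return -np.minimum(unclipped, clipped).mean()
```

Clipping keeps each policy update close to the old policy: when the probability ratio moves outside `[1 - eps, 1 + eps]`, the objective stops rewarding further movement in that direction.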

Primary language: Python. License: MIT.
