ppo

PyTorch implementation of Proximal Policy Optimization (PPO) (Schulman et al., 2017)
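For reference, below is a minimal sketch of the clipped surrogate objective that PPO optimizes, written in PyTorch. This is an illustration of the technique from the paper, not code taken from this repository; the function name, tensor names, and the clip ratio of 0.2 are assumptions.

```python
# Sketch of PPO's clipped policy loss (Schulman et al., 2017).
# Not the repository's implementation; names and defaults are illustrative.
import torch


def ppo_clip_loss(new_log_probs, old_log_probs, advantages, clip_eps=0.2):
    """Clipped surrogate loss: -E[min(r * A, clip(r, 1-eps, 1+eps) * A)]."""
    ratio = torch.exp(new_log_probs - old_log_probs)  # importance ratio r_t
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # Take the pessimistic (minimum) objective and negate it for gradient descent.
    return -torch.min(unclipped, clipped).mean()


if __name__ == "__main__":
    # Dummy batch of 64 actions to show the call shape.
    new_lp = torch.randn(64, requires_grad=True)
    old_lp = new_lp.detach() + 0.1 * torch.randn(64)
    adv = torch.randn(64)
    loss = ppo_clip_loss(new_lp, old_lp, adv)
    loss.backward()
    print(loss.item())
```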

Primary language: Python. License: MIT.
