# jarbus/readable-ppo
An arguably readable Julia implementation of PPO, from Schulman et al.'s 2017 paper, [Proximal Policy Optimization Algorithms](https://arxiv.org/abs/1707.06347).
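The heart of PPO is the clipped surrogate objective from the paper (Eq. 7): the probability ratio between the new and old policies is clipped to `[1 - ε, 1 + ε]`, and the pessimistic minimum of the clipped and unclipped terms is taken. A minimal sketch in Julia, with illustrative names that are not necessarily those used in this repository:

```julia
# Sketch of the PPO clipped surrogate objective (Schulman et al., 2017, Eq. 7).
# `ratio` is π_θ(a|s) / π_θ_old(a|s); `advantage` is an advantage estimate.
# Function and argument names are illustrative, not from this repository.
function clipped_surrogate(ratio, advantage; eps=0.2)
    unclipped = ratio * advantage
    clipped = clamp(ratio, 1 - eps, 1 + eps) * advantage
    # Taking the minimum gives a pessimistic bound: large policy updates
    # are only penalized, never rewarded, which keeps updates conservative.
    min(unclipped, clipped)
end
```

In practice this is averaged over a batch of timesteps and maximized (or its negation minimized) with a gradient-based optimizer.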