rlgym-ppo

A vectorized implementation of PPO for use with RLGym
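At its core, a vectorized PPO implementation collects rollouts from many environments in parallel and computes advantages across the whole batch at once. The sketch below is illustrative only, not this library's API: it shows a batched Generalized Advantage Estimation (GAE) pass over `N` parallel environments using NumPy, with all function and parameter names chosen for the example.

```python
import numpy as np

def batched_gae(rewards, values, dones, gamma=0.99, lam=0.95):
    """Compute GAE advantages for a batch of parallel rollouts.

    rewards, dones: arrays of shape (T, N) for T steps and N envs.
    values: array of shape (T+1, N); the extra row bootstraps the
    value of the state after the final step.
    """
    T, N = rewards.shape
    advantages = np.zeros((T, N))
    last = np.zeros(N)
    # Walk the rollout backwards, accumulating the discounted
    # TD-residuals; episode boundaries (dones) cut the recursion.
    for t in reversed(range(T)):
        nonterminal = 1.0 - dones[t]
        delta = rewards[t] + gamma * values[t + 1] * nonterminal - values[t]
        last = delta + gamma * lam * nonterminal * last
        advantages[t] = last
    return advantages
```

With `gamma = lam = 1`, zero value estimates, and no episode ends, the advantage at step 0 reduces to the plain sum of future rewards, which makes the recursion easy to sanity-check.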

Primary language: Python. License: Apache-2.0.
