on-policy

This is the official implementation of Multi-Agent PPO (MAPPO).

Primary Language: Python · License: MIT
