mappo

This is the official implementation of Multi-Agent PPO (MAPPO).
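
For readers unfamiliar with the algorithm, below is a minimal, illustrative sketch of the MAPPO idea in Python: decentralized per-agent actors over local observations, a centralized critic over the global state, and the standard PPO clipped surrogate loss applied per agent. All class and function names here are hypothetical and do not reflect this repository's actual API.

```python
# Illustrative sketch of MAPPO's structure (not this repository's code).
import torch
import torch.nn as nn


class Actor(nn.Module):
    """Decentralized policy: maps one agent's local observation to action logits."""
    def __init__(self, obs_dim, act_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, act_dim),
        )

    def forward(self, obs):
        return torch.distributions.Categorical(logits=self.net(obs))


class CentralCritic(nn.Module):
    """Centralized value function: maps the shared global state to a scalar value."""
    def __init__(self, state_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, state):
        return self.net(state).squeeze(-1)


def ppo_actor_loss(actor, obs, actions, old_log_probs, advantages, clip=0.2):
    """Standard PPO clipped surrogate objective, applied to one agent's batch."""
    dist = actor(obs)
    log_probs = dist.log_prob(actions)
    ratio = torch.exp(log_probs - old_log_probs)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1 - clip, 1 + clip) * advantages
    return -(torch.min(unclipped, clipped)).mean()
```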

Primary language: Python · License: MIT
