# multi-agent-PPO-on-SMAC

Implementations of MAPPO and IPPO on SMAC (the StarCraft Multi-Agent Challenge), a multi-agent StarCraft II environment.
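For background, the sketch below shows the basic SMAC interaction loop that both algorithms build on, using the public `smac` package (oxwhirl/smac). The random action selection stands in for a trained PPO policy, and the map name and episode count are illustrative choices, not settings taken from this repo. The key difference between the two algorithms shows up in what the critic sees: IPPO's value functions condition on each agent's local observation (`env.get_obs()`), while MAPPO's centralized critic conditions on the global state (`env.get_state()`).

```python
import numpy as np
from smac.env import StarCraft2Env

env = StarCraft2Env(map_name="8m")  # illustrative map choice
env_info = env.get_env_info()
n_agents = env_info["n_agents"]

for episode in range(2):
    env.reset()
    terminated = False
    episode_reward = 0.0
    while not terminated:
        # Per-agent local observations (what IPPO's critics see);
        # the global state (MAPPO's centralized critic input) is env.get_state().
        obs = env.get_obs()
        state = env.get_state()

        actions = []
        for agent_id in range(n_agents):
            # SMAC masks unavailable actions; sample uniformly over legal ones.
            # A real PPO policy would sample from its masked action distribution here.
            avail = env.get_avail_agent_actions(agent_id)
            legal = np.nonzero(avail)[0]
            actions.append(np.random.choice(legal))

        # The environment returns a single shared team reward.
        reward, terminated, info = env.step(actions)
        episode_reward += reward
    print(f"Episode {episode}: total reward {episode_reward}")

env.close()
```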

Primary language: Python
