openai/multiagent-particle-envs

Checking collisions with agent itself?

tessavdheiden opened this issue · 3 comments

Hi!

Is it true that in simple_spread.py collisions are also checked between an agent and itself, at these lines:

    if agent.collide:
        for a in world.agents:
            if self.is_collision(a, agent):

Maybe do something like:

    if agent.collide:
        for a in world.agents:
            if agent == a: continue
            if self.is_collision(a, agent):
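
For context, the whole reward method in simple_spread.py could then read roughly like the sketch below. The landmark-distance term and the -1 per collision follow the stock scenario; the skip-self guard is the only change, so treat this as an illustration rather than a tested patch (np is NumPy, already imported at the top of the scenario file).

    def reward(self, agent, world):
        # Agents are rewarded based on how close the nearest agent is to each landmark.
        rew = 0
        for landmark in world.landmarks:
            dists = [np.sqrt(np.sum(np.square(a.state.p_pos - landmark.state.p_pos)))
                     for a in world.agents]
            rew -= min(dists)
        # Penalize collisions, skipping the agent itself so that only
        # genuine agent-agent overlaps count.
        if agent.collide:
            for a in world.agents:
                if agent == a:
                    continue
                if self.is_collision(a, agent):
                    rew -= 1
        return rew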

Have you noticed any improvement in average rewards per episode due to this change? I am curious.

Hi!

No, because the self-collision just adds a constant negative reward every time, so the gradient is not affected ;-)
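
To see why the self-check is a no-op for learning: is_collision compares the distance between two entities with the sum of their radii, and an agent's distance to itself is zero, so the self-comparison always reports a collision. A tiny standalone sketch (FakeAgent is a made-up stand-in for the MPE agent object; the distance test mirrors the scenario's is_collision):

    import numpy as np

    class FakeAgent:
        # Hypothetical stand-in for an MPE agent: just a position and a radius.
        def __init__(self, pos, size=0.15):
            self.p_pos = np.array(pos, dtype=float)
            self.size = size

    def is_collision(a1, a2):
        # Same test as the scenario: collide if centers are closer than the combined radii.
        dist = np.sqrt(np.sum(np.square(a1.p_pos - a2.p_pos)))
        return dist < a1.size + a2.size

    agent = FakeAgent([0.0, 0.0])
    print(is_collision(agent, agent))  # always True: distance 0 < 2 * size

So every agent gets an extra -1 on every step; shifting each step's reward by the same constant does not change which actions look better than others, which is the point of the reply above.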

Related: swarm-intelligence reinforcement learning for more than 50 agents without collisions:
https://github.com/Edision-liu/Reinforcement-learning-on-MAPE