Axelrod-Python/Axelrod

Adding a new strategy gotchas

marcharper opened this issue · 2 comments

In PR #1364, adding a new strategy tripped some of the tests for the meta strategies, because adding a strategy changes the behavior of those strategies. This is not ideal, since it adds complexity to the process of adding new strategies.
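To see why this happens, here is a minimal illustrative sketch (not the library's actual meta-strategy code; the `meta_majority` function and its tie-breaking rule are assumptions): a majority-vote meta strategy plays whatever most of its team would play, so adding one strategy to the team can flip the vote and change the recorded plays.

```python
C, D = "C", "D"

def meta_majority(team_plays):
    """Play C if at least half of the (hypothetical) team would play C."""
    return C if team_plays.count(C) >= team_plays.count(D) else D

# Team of three established strategies: the majority says defect.
print(meta_majority([C, D, D]))      # -> D

# Add one new strategy that cooperates: the tie now resolves to C,
# so the meta strategy's play changes and any test pinned to the old
# output starts failing.
print(meta_majority([C, D, D, C]))   # -> C
```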

There are various ways we could mitigate this:

  • Simply remove the tests -- they don't really test "expected behaviors"; they basically just detect that something changed. We could still run a test of some sort to make sure nothing breaks, just without caring about the exact output
  • Following the suggestion in #1353, as part of the testing / presubmit process (see #1360), we could detect these test changes and suggest fixes
  • Pin the strategies that go into the meta strategies, so their behavior doesn't actually change
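The third option could be sketched like this (all names here are hypothetical stand-ins, not the library's code): instead of building the team from a dynamically generated list of all strategies, the meta player defaults to a frozen tuple, so merging a new strategy cannot silently change its behavior.

```python
class TitForTat: ...
class Cooperator: ...
class Defector: ...
class ShinyNewStrategy: ...   # imagine this one was just merged

def all_strategies():
    # Stand-in for a dynamically generated, library-wide strategy list,
    # which grows every time a new strategy is added.
    return [TitForTat, Cooperator, Defector, ShinyNewStrategy]

# Pinned at a known point; new strategies do not join automatically.
PINNED_TEAM = (TitForTat, Cooperator, Defector)

class MetaPlayer:
    def __init__(self, team=PINNED_TEAM):
        # Behavior only changes if the pinned tuple is deliberately updated.
        self.team = [cls() for cls in team]
```

The trade-off is that pinned meta strategies no longer reflect the full, current strategy set, which may or may not be what users expect.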

At the very least, we should update the documentation. There are also documentation tests that tend to fail when adding a new strategy (for example, counts of how many strategies there are), but those seem to serve a somewhat useful purpose.

Gosh, sorry, I've just seen your comment on #1364 -- my bad for missing it.

Simply remove the tests -- they don't really test "expected behaviors"; they basically just detect that something changed. We could still run a test of some sort to make sure nothing breaks, just without caring about the exact output

I think I'm in favour of that approach. It's not ideal, but it is probably the most sensible option on balance.

There are already some tests in TestPlayer that do exactly what is described above, e.g. tests that a clone and the original produce the same plays against a small set of opponents. So I guess we just remove the extra tests, which I did in #1373.
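That clone property can be sketched as follows (a simplified stand-in, not the library's actual TestPlayer code; the class and helper names are assumptions): a clone and the original should produce identical plays against the same opponent sequence.

```python
C, D = "C", "D"

class TitForTat:
    """Minimal stand-in strategy: cooperate first, then mirror the opponent."""
    def strategy(self, opponent_history):
        return C if not opponent_history else opponent_history[-1]

    def clone(self):
        return type(self)()

def plays_against(player, opponent_moves):
    """Collect the player's plays against a fixed opponent sequence."""
    opponent_history, plays = [], []
    for move in opponent_moves:
        plays.append(player.strategy(opponent_history))
        opponent_history.append(move)
    return plays

original = TitForTat()
duplicate = original.clone()
opponent = [C, D, C, C, D]

# The property the existing tests check: clone and original agree.
assert plays_against(original, opponent) == plays_against(duplicate, opponent)
```

Unlike the meta-strategy output tests, this check keeps passing when new strategies are added, since it compares a player only against its own clone.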