NEAT (NeuroEvolution of Augmenting Topologies) is a method developed by Kenneth O. Stanley for evolving arbitrary neural networks.
For further information on general concepts and theory, see the Selected Publications page on Stanley's website.
The training files can run with their default arguments.
If you adjust the downsampling rate or play a different game, you will need to adjust "num_inputs" in your config file accordingly.
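As a rough guide, here is one way the relationship between the downscale rate and "num_inputs" can be checked. The 224x256 frame size used below is an assumption (a common SNES resolution); check your environment's actual observation shape, and note the real preprocessing code may round differently.

```python
# Back-of-envelope check for "num_inputs" after downsampling.
# Assumption: each downscale-by-downscale block of pixels becomes
# one input to the network, using integer (floor) division.
def num_inputs(height, width, downscale):
    return (height // downscale) * (width // downscale)

# e.g. a 224x256 frame at a downscale rate of 16 yields 14 * 16 inputs
size = num_inputs(224, 256, 16)
```

If this number does not match "num_inputs" in your config file, training will fail with a shape mismatch.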
This will train using a single CPU core and render both the OpenAI environment and the downsampled network input.
For help:
python3 train.py -h
Format:
python3 train.py -c <checkpoint-filepath> -d <downscale rate> -g <game-env> -e <number-of-generations> -r <record (bool)> -s <state>
This will train using p threads and will not do any rendering. For help:
python3 train-parallel.py -h
Format:
python3 train-parallel.py -c <checkpoint-filepath> -d <downscale rate> -g <game-env> -e <number-of-generations> -p <number of threads> -r <record (bool)> -s <state>
Example:
python3 train-parallel.py -d 16 -g SuperMarioWorld-Snes -r ./replays -p 12 -s YoshiIsland1
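For intuition, the parallel training loop boils down to farming genome evaluations out to a worker pool. The sketch below shows that pattern with a placeholder fitness function (`eval_genome` here is hypothetical; the real script scores each genome by playing the emulator, and NEAT libraries typically ship a ready-made parallel evaluator for this).

```python
from multiprocessing import Pool

def eval_genome(genome):
    # Placeholder fitness: the real evaluator would run the game
    # with this genome's network and return the achieved score.
    return genome * genome

def evaluate_population(genomes, num_threads):
    # Each worker evaluates genomes independently; no rendering is done.
    with Pool(processes=num_threads) as pool:
        return pool.map(eval_genome, genomes)

if __name__ == "__main__":
    fitnesses = evaluate_population([1, 2, 3, 4], num_threads=2)
```

Because evaluations are independent, throughput scales roughly with the number of threads, up to the number of available cores.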
This will play back the winning genome (or any genome saved in pickle format).
python3 playback.py -d <downscale rate> -g <game> -p <path-to-winner.pkl> -s <state>
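Loading a saved genome for playback is a plain pickle read. A minimal sketch, assuming the winner file is an ordinary pickle as described above (`load_genome` is a hypothetical helper name):

```python
import pickle

def load_genome(path):
    # Read a genome saved by the training run (e.g. winner.pkl).
    with open(path, "rb") as f:
        return pickle.load(f)
```

The loaded object can then be turned into a network and stepped through the environment, which is what playback.py does with the -g and -s arguments.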
The NEAT paper
- get the size of the input programmatically
- read the "-r" parameter to specify the replay directory
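For the second TODO item, one possible shape for the "-r" flag is to accept a directory path rather than a boolean, defaulting to no recording. This is a sketch using the standard argparse module, not the repo's actual argument parser:

```python
import argparse

# Hypothetical argument definition: "-r" takes a replay directory,
# and omitting it disables recording entirely.
parser = argparse.ArgumentParser()
parser.add_argument("-r", "--record", default=None,
                    help="directory to save replays (omit to disable recording)")

# Parse a sample command line like the training example above.
args = parser.parse_args(["-r", "./replays"])
```

With this scheme, the training script would only pass a record path to the environment when args.record is not None.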