colgreen/sharpneat

Performance tune activation functions

colgreen opened this issue · 1 comment

See the recent performance tuning tweaks made to:

SharpNeat.NeuralNets.Double.ActivationFunctions.Vectorized.LeakyReLU

for hints.

Consider whether there are any perf tuning opportunities for the scalar implementations - these are the ones currently in use, as they actually run faster for the neural net topologies that tend to get evolved, i.e., highly irregular connectivity.
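For illustration, here is a minimal sketch (in Java, not SharpNeat's actual C# code) of the kind of scalar tuning that can apply to a LeakyReLU: replacing the conditional with a branchless min/max formulation, which can help when input signs are unpredictable. The leak coefficient `A` below is a hypothetical value, not necessarily the one SharpNeat uses.

```java
public class LeakyReluSketch {

    // Hypothetical leak coefficient for the negative half of the function.
    static final double A = 0.001;

    // Straightforward scalar form with a branch. On highly irregular
    // inputs the branch predictor may fare poorly.
    static double leakyReluBranch(double x) {
        return (x >= 0.0) ? x : x * A;
    }

    // Branchless scalar form: split x into its non-negative and
    // non-positive parts, scale only the negative part.
    static double leakyReluBranchless(double x) {
        return Math.max(x, 0.0) + Math.min(x, 0.0) * A;
    }

    public static void main(String[] args) {
        double[] xs = { -2.0, -0.5, 0.0, 0.5, 2.0 };
        for (double x : xs) {
            System.out.println(leakyReluBranch(x) + " " + leakyReluBranchless(x));
        }
    }
}
```

Whether the branchless form is actually faster depends on the JIT/compiler and the input distribution, so any change like this would need benchmarking against the evolved-topology workloads mentioned above.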

Done.