GiorgosXou/NeuralNetworks

error: too many initializers for 'float (NeuralNetwork::Layer::* const [12])(const float&)'

GiorgosXou opened this issue · 0 comments

When using #define ACTIVATION__PER_LAYER without defining any specific activation function, the library automatically enables ALL_ACTIVATION_FUNCTIONS. I accidentally forgot to change NUM_OF_USED_ACTIVATION_FUNCTIONS from 12 to 14, which causes the "too many initializers" error above.

#define NUM_OF_USED_ACTIVATION_FUNCTIONS 12
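For context, here is a minimal, self-contained sketch of the pattern behind the error: a const array of Layer member-function pointers whose size comes from NUM_OF_USED_ACTIVATION_FUNCTIONS and whose initializer lists one entry per activation. The class layout, the function bodies, and the array name activations are illustrative assumptions, not the library's actual source; only NeuralNetwork, Layer, NUM_OF_USED_ACTIVATION_FUNCTIONS, ACTIVATION__PER_LAYER, and ALL_ACTIVATION_FUNCTIONS come from the issue itself. The sketch registers 2 activations to stay short; in the library the same relationship means the macro must say 14 once ALL_ACTIVATION_FUNCTIONS enables all 14 activations.

```cpp
#include <cmath>
#include <cstdio>

// Stand-in for the library's count macro. In the library, enabling
// ALL_ACTIVATION_FUNCTIONS registers 14 activations, so leaving the macro
// at 12 makes the initializer list longer than the array and triggers
// "too many initializers for
//  'float (NeuralNetwork::Layer::* const [12])(const float&)'".
// This sketch uses 2 activations just to keep it short.
#define NUM_OF_USED_ACTIVATION_FUNCTIONS 2

class NeuralNetwork {
public:
    class Layer {
    public:
        // Two example activations (hypothetical implementations).
        float Sigmoid(const float &x) { return 1.0f / (1.0f + std::exp(-x)); }
        float ReLU(const float &x) { return x > 0.0f ? x : 0.0f; }

        // Array of pointers to Layer member functions, the same shape
        // as the type named in the compiler error.
        typedef float (Layer::*Activation)(const float &);
        static const Activation activations[NUM_OF_USED_ACTIVATION_FUNCTIONS];
    };
};

// One initializer per registered activation; the count macro must match
// the number of entries here, otherwise the compiler rejects the array.
const NeuralNetwork::Layer::Activation
    NeuralNetwork::Layer::activations[NUM_OF_USED_ACTIVATION_FUNCTIONS] = {
        &NeuralNetwork::Layer::Sigmoid,
        &NeuralNetwork::Layer::ReLU,
};

int main() {
    NeuralNetwork::Layer layer;
    // Dispatch through the member-function-pointer table.
    float y = (layer.*NeuralNetwork::Layer::activations[0])(0.5f);
    std::printf("Sigmoid(0.5) = %f\n", y);
    return 0;
}
```

Bumping NUM_OF_USED_ACTIVATION_FUNCTIONS to match the 14 activations enabled by ALL_ACTIVATION_FUNCTIONS (i.e. changing 12 to 14) removes the mismatch and the error.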