neurosim/MLP_NeuroSim_V3.0

Changed the activation function, but it returns the same accuracy value


Hello, when I change the activation function inside formula.cpp (for example, replacing it with tanh), training consistently returns an accuracy of 9.80%. Do you know why this might be happening?

[Screenshot from 2023-12-17 17-46-44]
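For reference, a change of that kind in formula.cpp might look like the minimal sketch below. This is a sketch under assumptions, not the confirmed V3.0 layout: the function name sigmoid and its role as the network's activation are inferred from the backpropagation code quoted later in this thread.

#include <cmath>

/* formula.cpp -- hypothetical tanh drop-in for the activation.
   The function name `sigmoid` is an assumption; keeping the original
   name means every call site elsewhere still compiles. */
double sigmoid(double x) {
	return std::tanh(x);  /* output now spans (-1, 1) instead of (0, 1) */
}

Note that a swap like this changes only the forward pass; the backward pass lives elsewhere, which is exactly the problem described in the reply below.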

Thank you for your help.

Hi,

Disclaimer: it's been a few years since I've looked at this code. The gradient of the activation is hardcoded inside train.cpp, so you will also need to update the backpropagation code there if you want to change the activation.

// Backpropagation
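For orientation, the section under that comment looks roughly like the sketch below. This is a reconstruction based on the modified code the original poster pastes later in this thread; the a*(1 - a) factors are the hardcoded derivative of the sigmoid.

/* train.cpp -- original backpropagation deltas (sketch).
   a2[j]*(1 - a2[j]) and a1[j]*(1 - a1[j]) are the sigmoid
   derivative evaluated at the layer outputs. */

/* Second layer (hidden layer to the output layer) */
for (int j = 0; j < param->nOutput; j++) {
	s2[j] = -2 * a2[j] * (1 - a2[j]) * (Output[i][j] - a2[j]);
}

/* First layer (input layer to the hidden layer) */
for (int j = 0; j < param->nHide; j++) {
	for (int k = 0; k < param->nOutput; k++) {
		s1[j] += a1[j] * (1 - a1[j]) * weight2[k][j] * s2[k];
	}
}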

Hello, first of all, thank you for your reply. I changed the part you mentioned as follows, but after a certain number of iterations it gets stuck at 10.35%. (I want to use tanh as the activation function.)


// Backpropagation
/* Second layer (hidden layer to the output layer) */
for (int j = 0; j < param->nOutput; j++) {
	s2[j] = -2 * a2[j] * (1 - a2[j]*a2[j]) * (Output[i][j] - a2[j]);
}

/* First layer (input layer to the hidden layer) */
std::fill_n(s1, param->nHide, 0);
#pragma omp parallel for
for (int j = 0; j < param->nHide; j++) {
	for (int k = 0; k < param->nOutput; k++) {
		s1[j] += a1[j] * (1 - a1[j]*a1[j]) * weight2[k][j] * s2[k];
	}
}
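An editorial observation on the snippet above: the deltas still carry the leading a2[j] and a1[j] factors from the sigmoid version. The sigmoid derivative at output a is a*(1 - a), which is where that leading a comes from, but the tanh derivative at output a is 1 - a*a, with no leading factor. If that reading of the code is right, the tanh deltas would instead look like this sketch:

/* Hypothetical tanh deltas: the derivative of tanh at output a is
   (1 - a*a), so the leftover sigmoid factors a2[j] and a1[j] are dropped. */
for (int j = 0; j < param->nOutput; j++) {
	s2[j] = -2 * (1 - a2[j]*a2[j]) * (Output[i][j] - a2[j]);
}

std::fill_n(s1, param->nHide, 0);
#pragma omp parallel for
for (int j = 0; j < param->nHide; j++) {
	for (int k = 0; k < param->nOutput; k++) {
		s1[j] += (1 - a1[j]*a1[j]) * weight2[k][j] * s2[k];
	}
}

Since tanh outputs span (-1, 1) rather than (0, 1), any other parts of the simulator that assume non-negative activations may also need adjusting.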