Wrong derivative of leaky-relu activation
Opened this issue · 1 comment
sjh11556 commented
According to the mod_activation.F90 file, it seems the derivative function of leaky_relu is the same as that of ReLU: for non-positive inputs it returns 0 instead of alpha.
pure function leaky_relu_prime(x, alpha) result(res)
  ! First derivative of the REctified Linear Unit (RELU) activation function.
  real(rk), intent(in) :: x(:)
  real(rk), intent(in) :: alpha
  real(rk) :: res(size(x))
  where (0.3 * x > 0)
    res = 1
  elsewhere
    res = 0
  end where
end function leaky_relu_prime
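For reference, a corrected derivative would return alpha (rather than 0) where x <= 0, and the condition 0.3 * x > 0 is equivalent to just x > 0. A minimal sketch, assuming the same rk kind and interface as above:

pure function leaky_relu_prime(x, alpha) result(res)
  ! First derivative of the leaky ReLU activation function:
  ! 1 where x > 0, alpha elsewhere.
  real(rk), intent(in) :: x(:)
  real(rk), intent(in) :: alpha
  real(rk) :: res(size(x))
  where (x > 0)
    res = 1
  elsewhere
    res = alpha
  end where
end function leaky_relu_prime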
milancurcic commented
FWIW neural-fortran-0.12.0 implements leaky ReLU.