Why does the wrong derivative work better?
Hello! I'm writing a multilayer perceptron with backpropagation following this tutorial. It uses the logistic function and its derivative:
import numpy as np

def nonlin(self, x, deriv=False):
    if deriv:
        return x * (1 - x)           # "derivative" branch from the tutorial
    return 1 / (1 + np.exp(-x))      # logistic (sigmoid) function
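For context, tutorials of this kind usually call nonlin on the layer's activation in the backward pass, not on the pre-activation input. A minimal sketch of that pattern (the names X, y, syn0, l0, l1 and the single-layer training loop are assumptions for illustration, not code from the original post, and nonlin is taken as a plain function here without self):

import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)           # x is assumed to already be the sigmoid output
    return 1 / (1 + np.exp(-x))

np.random.seed(1)
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])  # toy inputs
y = np.array([[0, 1, 1, 0]]).T                              # toy targets
syn0 = 2 * np.random.random((3, 1)) - 1                     # weights

for _ in range(1000):
    l0 = X
    l1 = nonlin(np.dot(l0, syn0))                  # forward pass: l1 = sigmoid(l0 @ syn0)
    l1_error = y - l1
    l1_delta = l1_error * nonlin(l1, deriv=True)   # derivative evaluated on the OUTPUT l1
    syn0 += np.dot(l0.T, l1_delta)                 # weight update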
Because when the derivative is calculated, nonlin is given not this layer's input, but the previously computed output. The derivative of the logistic function is sigmoid(z) * (1 - sigmoid(z)), so when x is already the output sigmoid(z), x * (1 - x) is exactly the correct derivative.
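A quick numerical check of this, using a plain-function version of nonlin (the value z = 0.7 is just an arbitrary example):

import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

z = 0.7
out = nonlin(z)                                   # sigmoid output
true_deriv = nonlin(z) * (1 - nonlin(z))          # analytic sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
print(np.isclose(nonlin(out, deriv=True), true_deriv))   # True: out * (1 - out) equals sigmoid'(z)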