Activation function for multilayer neural network?
I am solving the XOR problem with a multilayer neural network trained by backpropagation of error. The input is two numbers a and b (each equal to 0 or 1), and the output should be xor(a, b). As the activation I use the sigmoid function 1 / (1 + exp(-x)).

Of course, I can't get an exact answer, only some kind of approximation. Are there activation functions that can solve this problem exactly? If there are, could you give an example of such a function?
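For reference, here is a minimal sketch of the setup described above: one hidden layer, trained on XOR with backpropagation and the logistic sigmoid. The 2-2-1 architecture, random initialization, learning rate, and iteration count are illustrative assumptions, not details from the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Truth table for xor(a, b).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 2 inputs -> 2 hidden units -> 1 output (assumed architecture).
W1 = rng.normal(size=(2, 2))
b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1))
b2 = np.zeros(1)

lr = 1.0  # assumed learning rate
for _ in range(10_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: squared-error loss, sigmoid derivative s * (1 - s).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

# For most initializations the outputs end up close to [0, 1, 1, 0],
# but a sigmoid can never output exactly 0 or 1.
print(out.ravel())
```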
In your case, I think it would make sense to round the output to the nearest integer. But if you want to get a more accurate result, there is a huge number of functions you can try as the activation function for your neurons (see the sketch after this list for one that solves XOR exactly). In particular:
Linear transfer function
Threshold transfer function
Sigmoid transfer function
Logistic function (which you are currently using)
Hyperbolic tangent
Modified hyperbolic tangent
Radial basis transfer function
And many, many more functions
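To illustrate the threshold option: a step activation produces exact 0/1 outputs, so a small network can compute XOR with no approximation at all. The weights below are hand-picked for illustration, not learned; the two hidden units compute OR and AND, and the output combines them as OR AND NOT(AND), which is XOR.

```python
import numpy as np

def step(x):
    """Threshold transfer function: outputs are exactly 0 or 1."""
    return (x >= 0).astype(float)

# Hand-picked weights (hypothetical, not trained):
# hidden unit 1 fires when a + b >= 0.5  (OR),
# hidden unit 2 fires when a + b >= 1.5  (AND),
# output fires when OR - AND >= 0.5      (XOR).
W1 = np.array([[1.0, 1.0],   # each column feeds one hidden unit
               [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
W2 = np.array([[1.0], [-1.0]])
b2 = np.array([-0.5])

def xor_net(a, b):
    h = step(np.array([a, b], dtype=float) @ W1 + b1)
    return step(h @ W2 + b2)[0]

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # exactly 0.0 or 1.0, no rounding needed
```

Note that the step function has zero derivative almost everywhere, so such a network cannot be trained with backpropagation; its weights have to be chosen by hand. That is why in practice one usually trains with a smooth function like the sigmoid and then rounds the output, as suggested above.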