Neural networks
Tamul, 2020-12-08 19:04:22

Why is the softsign activation function unpopular?

On Wikipedia, in the article about activation functions of artificial neurons, I found a function called Softsign, which is computed simply as f(x) = x / (1 + |x|), has a range from -1 to 1, and crosses zero at the origin. Apart from the Wikipedia article, I found literally two or three other references to it elsewhere, and even those were not about results from real work.
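A minimal sketch of the formula from the question, just to show its behavior (values stay strictly between -1 and 1 and pass through zero at the origin):

```python
# Softsign as defined in the question: f(x) = x / (1 + |x|).
def softsign(x):
    return x / (1 + abs(x))

print(softsign(0.0))    # 0.0 (zero-crossing at the origin)
print(softsign(10.0))   # ~0.909, approaching 1 but never reaching it
print(softsign(-10.0))  # ~-0.909, approaching -1 but never reaching it
```

Unlike tanh, which saturates exponentially, softsign approaches its asymptotes only polynomially, which is the property usually cited in the few sources that discuss it.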
So why is it so unsuitable that it doesn't get mentioned in every article about neural networks? For much rarer (and much stranger) functions there is plenty of information about successful and unsuccessful uses, but for this one there is nothing.
One of the articles:
[plot of the softsign function]


1 answer(s)
Sergey, 2020-12-09
@begemot_sun

Its unpopularity comes from the fact that it is a piecewise function, and its derivative at the point 0 is not defined.
These days NNs are usually trained with the backpropagation method, which uses the derivative to adjust the weights. Accordingly, this kind of function creates problems in the calculations.
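To illustrate the role the derivative plays in backpropagation, here is a small sketch (my own, not from the answer) comparing the closed-form derivative of softsign, 1 / (1 + |x|)², against a numerical central-difference estimate at a point away from zero:

```python
def softsign(x):
    return x / (1 + abs(x))

# Closed-form derivative away from the kink at 0: f'(x) = 1 / (1 + |x|)^2.
def softsign_grad(x):
    return 1 / (1 + abs(x)) ** 2

# Central-difference check at x = 2.0.
x, h = 2.0, 1e-6
numeric = (softsign(x + h) - softsign(x - h)) / (2 * h)
print(abs(numeric - softsign_grad(x)) < 1e-6)  # True
```

Backpropagation multiplies such derivatives layer by layer, so any point where the derivative is ill-behaved, or any function whose gradient shrinks quickly, directly affects how the weights are updated.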
