Machine learning
Pantuchi, 2020-10-28 08:54:23

Backpropagation of error: why do the signs in the formulas differ when the approach is the same?

Hello everyone.
While studying neural networks and comparing the formulas across different sources, I ran into an ambiguity in the backpropagation method. Notation: d is the desired result, y is the network response, w is a weight, g is the gradient, v is the sigmoid value.
Some sources write (1) e = d - y, while others write (2) e = y - d.
For the weight adjustment, one version is (1) w = w - learning rate * g * v, and the other is (2) w = w + learning rate * g * v. So far, the only regularity I have found is that the sign used in the weight adjustment tracks the order of the variables in the error calculation.
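Below is a minimal sketch (a single sigmoid neuron with squared error; the input x, learning rate lr, and the numeric values are illustrative assumptions, not taken from the post) showing that the two conventions produce the same weight update when the sign of the error and the sign in the update rule are paired consistently.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = 0.5    # input (assumed for illustration)
w = 0.8    # weight
d = 1.0    # desired result
lr = 0.1   # learning rate

y = sigmoid(w * x)     # network response
v = y * (1.0 - y)      # sigmoid derivative value

# Convention A: e = d - y, weight update uses "+"
e_a = d - y
w_a = w + lr * e_a * v * x

# Convention B: e = y - d (loss-gradient convention), weight update uses "-"
e_b = y - d
w_b = w - lr * e_b * v * x

print(w_a, w_b)  # identical: the sign flip in e cancels the sign flip in the update
```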


1 answer
freeExec, 2020-10-28

Because this is not a strict formula along the lines of "how to find the subtrahend". All that matters here is the direction in which the network needs to be "pushed"; how far to push is something everyone decides for themselves in their particular case. And you haven't even gotten to L1 and L2 regularization yet.
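As a hedged sketch of the last remark (the lambda values, names, and the gradient value g are illustrative assumptions, not part of the answer), L1 and L2 regularization would simply add extra terms to the same gradient-descent step:

```python
lr = 0.1           # learning rate
w = 0.8            # current weight
g = 0.05           # assumed gradient of the loss w.r.t. w (convention e = y - d)
lambda_l1 = 0.001  # L1 regularization strength (illustrative)
lambda_l2 = 0.01   # L2 regularization strength (illustrative)

# L2 adds lambda_l2 * w to the gradient, L1 adds lambda_l1 * sign(w)
sign_w = 1.0 if w > 0 else -1.0
w = w - lr * (g + lambda_l2 * w + lambda_l1 * sign_w)
print(w)
```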
