Mathematics
Prizm, 2020-04-06 01:19:57

What is the analogue of Newton's method for multidimensional space?

There is Newton's method for finding a zero of a function, x -> x - y/y', and the gradient descent method for finding a minimum of a function, x -> x - a*gradx.
Is it possible to "cross" them and get a better-behaved version of gradient descent, for example with the update x -> x - a*y*gradx/(|gradx|^2)? If so, how do I do it correctly? And if not, why not?
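
For context (not part of the original question): the proposed update can be tried directly as a root-finding iteration for a scalar function of several variables. Below is a minimal sketch; the toy function f(x, y) = x^2 + y^2 - 1, its gradient, and the step factor a are illustrative assumptions.

```python
# Sketch of the update x -> x - a * y * grad / |grad|^2 from the question,
# used to look for a zero of a scalar function f: R^n -> R.
import numpy as np

def newton_like_root_step(f, grad_f, x, a=1.0):
    """One step of x -> x - a * f(x) * grad f(x) / |grad f(x)|^2."""
    g = grad_f(x)
    denom = np.dot(g, g)
    if denom == 0.0:          # gradient vanished: the step is undefined
        return x
    return x - a * f(x) * g / denom

# Toy example: f(x, y) = x^2 + y^2 - 1, whose zero set is the unit circle
f = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0
grad_f = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])

x = np.array([3.0, 4.0])
for _ in range(20):
    x = newton_like_root_step(f, grad_f, x)
print(x, f(x))  # lands on a point of the unit circle, f(x) ~ 0
```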



2 answers
Prizm, 2021-05-10
@PrizmMARgh

Some time later I found out that everything described above is a special case of optimizers (I forgot to mention in the question that all of this was needed to train a neural network with gradient descent), for example the Adam optimizer. Those are the easiest to use.
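
For reference, here is a minimal sketch of the Adam update mentioned above, written out in plain NumPy following the standard formulation; the toy objective |x|^2 and the hyperparameter values are illustrative assumptions, not from the original answer.

```python
# One Adam update step: moving averages of the gradient and its square,
# bias correction, then a per-coordinate scaled step.
import numpy as np

def adam_step(x, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """Return updated parameters and moment estimates after one Adam step."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x, m, v

# Usage: minimize f(x) = |x|^2, whose gradient is 2x
x = np.array([5.0, -3.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 2001):
    grad = 2 * x
    x, m, v = adam_step(x, grad, m, v, t, lr=0.05)
print(x)  # approaches the minimum at the origin
```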

Ilya, 2020-05-04
@illaaa

I don't quite understand what the idea is. Or rather, what result do you want to get?
Newton's method finds a zero of a function, i.e. a point where the function equals zero.
Gradient descent finds a minimum of a function (not necessarily the global one).
A zero of a function does not have to be its minimum.
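
A small illustration of that distinction (not from the original answer): for f(x) = x^2 - 4, Newton's method converges to a zero (x = 2), while gradient descent converges to the minimum (x = 0, where f = -4).

```python
# Newton's method vs. gradient descent on f(x) = x^2 - 4
f = lambda x: x ** 2 - 4.0
df = lambda x: 2.0 * x

x_newton = 5.0
for _ in range(20):
    x_newton = x_newton - f(x_newton) / df(x_newton)   # Newton: x -> x - f/f'

x_gd = 5.0
for _ in range(200):
    x_gd = x_gd - 0.1 * df(x_gd)                       # gradient descent: x -> x - a*f'

print(x_newton)  # ~2.0  (a zero of f, not its minimum)
print(x_gd)      # ~0.0  (the minimum of f, where f = -4)
```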
