Neural networks
Daniel, 2020-03-23 10:07:41

Does the neural network give a different answer for different initial weights?

What can be said about a multilayer perceptron trained with backpropagation when the result of running a working example fluctuates quite a lot, and the only thing that changes between runs is the random initial weights? In theory, the weights should converge toward some values after a hundred epochs.
I found the Perceptron Convergence Theorem. I have been analyzing my implementation for two days, trying to work out what the reasons could be.
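
For reference, a minimal sketch (using scikit-learn's MLPClassifier, not the asker's own implementation; dataset and hyperparameters are purely illustrative) that reproduces the observation: the same network, trained on the same data, ends at a different loss when only the random initial weights change.

```python
# Minimal sketch (assuming scikit-learn; not the asker's own code):
# the same MLP on the same data ends at a different loss when only
# the random initial weights change.
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=300, noise=0.25, random_state=0)  # fixed toy data

for seed in range(5):
    clf = MLPClassifier(hidden_layer_sizes=(8,),
                        max_iter=100,        # "a hundred epochs"
                        random_state=seed)   # only the initial weights differ
    clf.fit(X, y)
    print(f"seed={seed}  loss={clf.loss_:.4f}  accuracy={clf.score(X, y):.3f}")
```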


1 answer
freeExec, 2020-03-23

Yes, you are ending up in different local minima. You can play with the L1/L2 regularization parameters, or choose a different optimizer for gradient descent.
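
A minimal sketch of those suggestions, again with scikit-learn's MLPClassifier (illustrative values; note that MLPClassifier exposes only L2 regularization, via its alpha parameter): fixing random_state makes runs reproducible, while different solvers and regularization strengths change which minimum the training settles into.

```python
# Sketch of the suggestions above (assuming scikit-learn; values are
# illustrative, not a definitive recipe).
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=300, noise=0.25, random_state=0)

for solver in ("sgd", "adam"):          # different gradient-descent optimizers
    for alpha in (1e-4, 1e-2):          # L2 regularization strength
        clf = MLPClassifier(hidden_layer_sizes=(8,),
                            solver=solver,
                            alpha=alpha,
                            max_iter=500,
                            random_state=0)  # fixed seed -> reproducible runs
        clf.fit(X, y)
        print(f"solver={solver:>4}  alpha={alpha:g}  loss={clf.loss_:.4f}")
```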
