Machine learning
Sergey Simonov, 2020-08-24 21:50:21

Is the "jumping error" phenomenon normal when training a neural network, and can the error be negative?

I tested a neural network with various model parameters: the number of fully connected layers, their sizes, and the batch size. The network is validated by its loss, which decreased, but the error began to fluctuate with growing amplitude (a sketch of such a setup is shown after the questions below).
Hence, two questions:

  1. Is the phenomenon of the error "jumping" from positive to negative normal during training?
  2. Do I understand correctly that, in the end, the error should be positive?
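
For reference, a minimal sketch of what the described setup might look like, assuming a PyTorch implementation (the question does not name a framework); the toy data, architecture, and hyperparameters below are hypothetical. It logs the per-batch training loss, which normally jumps from batch to batch, and the epoch-averaged validation MSE, which is non-negative by construction.

# Minimal sketch, assuming PyTorch; data, model, and hyperparameters are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Toy regression data: 1000 samples, 10 features.
X = torch.randn(1000, 10)
y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(1000, 1)
train_ds = TensorDataset(X[:800], y[:800])
val_ds = TensorDataset(X[800:], y[800:])

# Two fully connected layers, as in the question; sizes are arbitrary.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()  # MSE is non-negative by construction
opt = torch.optim.SGD(model.parameters(), lr=0.01)

batch_size = 32  # one of the hyperparameters varied in the question
train_loader = DataLoader(train_ds, batch_size=batch_size, shuffle=True)
val_loader = DataLoader(val_ds, batch_size=batch_size)

for epoch in range(10):
    model.train()
    for xb, yb in train_loader:
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)  # per-batch loss: expected to fluctuate
        loss.backward()
        opt.step()

    model.eval()
    with torch.no_grad():
        # Average validation loss over the whole validation set.
        val_loss = sum(loss_fn(model(xb), yb).item() * len(xb)
                       for xb, yb in val_loader) / len(val_ds)
    print(f"epoch {epoch}: val MSE = {val_loss:.4f}")  # always >= 0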

[Attached image: plot of the fluctuating error curve during training]
