Machine learning
roman22275, 2019-08-03 14:10:55

Why does the neural network train successfully on each training example individually, but when I try to train on all the examples at the same time, the error does not decrease?

I am solving the exclusive-or (XOR) problem. I don't use third-party libraries.
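The asker's code is not shown, so as a point of comparison here is a minimal stdlib-only sketch (my own, not the asker's) of a small sigmoid network trained on all four XOR examples at once, accumulating gradients over the whole batch before each weight update. The architecture (4 hidden units), learning rate, and epoch count are my assumptions:

```python
import math
import random

random.seed(0)

# XOR training set: all four examples trained at the same time
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]

H = 4  # hidden units (an assumption; 2 can work but gets stuck more often)
# small random init: symmetric or all-zero weights would stall learning
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    o = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, o

lr = 0.5
for epoch in range(10000):
    # accumulate gradients over the whole batch, then update once
    gw1 = [[0.0, 0.0] for _ in range(H)]
    gb1 = [0.0] * H
    gw2 = [0.0] * H
    gb2 = 0.0
    for x, y in zip(X, Y):
        h, o = forward(x)
        d_o = o - y  # output delta for sigmoid + cross-entropy loss
        gb2 += d_o
        for j in range(H):
            gw2[j] += d_o * h[j]
            d_h = d_o * w2[j] * h[j] * (1 - h[j])  # backprop to hidden layer
            gb1[j] += d_h
            gw1[j][0] += d_h * x[0]
            gw1[j][1] += d_h * x[1]
    b2 -= lr * gb2
    for j in range(H):
        w2[j] -= lr * gw2[j]
        b1[j] -= lr * gb1[j]
        w1[j][0] -= lr * gw1[j][0]
        w1[j][1] -= lr * gw1[j][1]

preds = [round(forward(x)[1]) for x in X]
print(preds)
```

A common cause of the symptom described (each example learns alone, the full set does not) is updating the weights per example with too large a step, or resetting state between examples; accumulating the batch gradient as above usually converges on XOR.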


1 answer
Ilya Neizvestnyj, 2019-10-09
@Cheloved

In general, if you plot the training error, you can see that at some point the network stops learning and overfitting sets in.
You cannot eliminate overfitting completely, but you can delay it:
1) Use regularization (L1, L2)
2) Dropout
3) If you work with images, you can augment them in the generator: flips, white-pixel noise, etc.
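To illustrate point 1, here is a small stdlib-only sketch (mine, not from the answer) of L2 regularization on a toy logistic-regression model. The dataset, learning rate, and regularization strength `l2` are assumptions; the point is only that the L2 penalty adds `l2 * w` to each weight gradient, which shrinks the learned weights:

```python
import math
import random

def train(l2=0.0, epochs=2000, lr=0.1, seed=1):
    """Logistic regression on a toy dataset; l2 is the penalty strength."""
    random.seed(seed)
    # tiny linearly separable toy data (hypothetical)
    data = [((0.0, 0.1), 0), ((0.1, 0.9), 0), ((0.9, 0.2), 1), ((1.0, 0.8), 1)]
    w = [random.uniform(-0.5, 0.5) for _ in range(2)]
    b = 0.0
    for _ in range(epochs):
        gw, gb = [0.0, 0.0], 0.0
        for x, y in data:
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))
            d = p - y  # gradient of cross-entropy w.r.t. z
            gw[0] += d * x[0]
            gw[1] += d * x[1]
            gb += d
        # L2 regularization: the penalty (l2/2)*||w||^2 adds l2*w to the gradient
        w[0] -= lr * (gw[0] + l2 * w[0])
        w[1] -= lr * (gw[1] + l2 * w[1])
        b -= lr * gb  # bias is conventionally left unregularized
    return w

w_plain = train(l2=0.0)
w_reg = train(l2=0.5)
norm = lambda w: sum(v * v for v in w) ** 0.5
print(norm(w_plain) > norm(w_reg))  # the penalty keeps the weights smaller
```

Smaller weights mean a smoother decision function, which is why the penalty delays overfitting; the same idea applies per-layer in a neural network.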
