Python
CeBePHblY, 2019-07-03 07:58:53

Is it worth shuffling the data when training a neural network?

I read somewhere that shuffling the input data during training helps a neural network converge faster. Where can I read about this? Is the game worth the candle?
Second question: if it really is beneficial, how do I do it correctly?

Like this?

It was like this:
[0, 0, 1, 1]
[0, 1, 0, 1]
[1, 1, 0, 0]
After shuffling it became:
[0, 1, 0, 1]
[1, 1, 0, 0]
[0, 0, 1, 1]
I.e. the cases (rows) were shuffled

Or like this?

It was like this:
[0, 0, 1, 1]
[0, 1, 0, 1]
[1, 1, 0, 0]
After shuffling it became:
[0, 1, 1, 0]
[1, 0, 1, 0]
[1, 0, 0, 1]
I.e. the values within each case were shuffled


1 answer
Alexander Varakosov, 2019-07-03
@CeBePHblY

The point is that at each epoch you draw a random sample from the original dataset. This speeds up training and helps avoid overfitting. Read "Deep learning. Dive into the world of neural networks".
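
To be concrete about the two options in the question: it is the cases (rows) that should be shuffled, the values inside each case stay where they are, and the labels must be permuted together with their rows. A minimal sketch with NumPy is below; the array names X and y and the tiny toy dataset are only for illustration, not part of the original question.

import numpy as np

# Toy dataset: each row of X is one training case, y holds its label.
X = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 1, 0, 0]])
y = np.array([0, 1, 1])

n_epochs = 5
for epoch in range(n_epochs):
    # A fresh random permutation of row indices each epoch,
    # so the network sees the cases in a different order every time.
    idx = np.random.permutation(len(X))
    X_shuffled = X[idx]
    y_shuffled = y[idx]  # labels are permuted together with their rows
    # ... feed X_shuffled / y_shuffled to the training step here ...

In practice you rarely shuffle by hand: Keras shuffles per epoch by default in model.fit (shuffle=True), and in PyTorch you pass shuffle=True to DataLoader.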
