Increase batch_size during network training?
I started training a CNN with a certain batch_size and then realized that the GPU memory would allow me to double it.
I want to stop training and resume from the last checkpoint, but with the larger batch_size. Will I lose the training progress I have made so far?
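For reference, a minimal sketch of what such a resume could look like, assuming PyTorch is used (the question does not say which framework). The model, dataset, file name "last_checkpoint.pt", and the batch sizes 32/64 are placeholders for illustration, not the asker's actual setup; only the DataLoader's batch_size changes, while the restored weights and optimizer state carry the progress over.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder model and data standing in for the asker's CNN and dataset.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3),  # (N, 1, 28, 28) -> (N, 8, 26, 26)
    nn.ReLU(),
    nn.Flatten(),                    # -> (N, 8 * 26 * 26)
    nn.Linear(8 * 26 * 26, 10),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
dataset = TensorDataset(torch.randn(256, 1, 28, 28),
                        torch.randint(0, 10, (256,)))

# Suppose training ran for a few epochs with batch_size=32 and a checkpoint was saved:
torch.save({"epoch": 5,
            "model_state_dict": model.state_dict(),
            "optimizer_state_dict": optimizer.state_dict()},
           "last_checkpoint.pt")

# Resume later with a doubled batch_size. Restoring the model and optimizer
# state keeps the learned weights, so the earlier progress is not lost;
# only the DataLoader is built with the new batch size.
checkpoint = torch.load("last_checkpoint.pt")
model.load_state_dict(checkpoint["model_state_dict"])
optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
start_epoch = checkpoint["epoch"] + 1

loader = DataLoader(dataset, batch_size=64, shuffle=True)  # was 32
loss_fn = nn.CrossEntropyLoss()

for epoch in range(start_epoch, start_epoch + 1):  # continue training
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
```

Note that doubling the batch size changes the gradient noise per step, so some practitioners also scale the learning rate when they do this, but the checkpointed weights themselves are unaffected.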