0% classification accuracy on the training set when training a DBN - is this normal?
I'm trying to train a deep belief network (4 layers: 500 -> 500 -> 500 -> 250 neurons).
The first stage, RBM pretraining, seems to go well: after 20 iterations the reconstruction error does not exceed 0.005.
Then I start fine-tuning with backpropagation and get 0% correct classifications and a large error. Both improve over the course of training, of course, but is this normal? From the lectures where I learned the algorithm, I got the impression that the whole point of the first stage is to choose weights so that a good minimum is already nearby.
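For intuition, here is a hypothetical NumPy sketch (not darch code; all names and sizes are assumptions, with 250 matching the top DBN layer in the question). It shows that a classifier whose output-layer weights are still random, as at the start of fine-tuning, should score near chance level, not 0%, so a result far below chance usually points to a setup problem rather than to pretraining failing:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 1000, 250, 10  # hypothetical sizes

# Pretrained features feeding an output layer whose weights are
# still random before any backprop fine-tuning has happened.
features = rng.standard_normal((n_samples, n_features))
labels = rng.integers(0, n_classes, n_samples)
W = rng.standard_normal((n_features, n_classes)) * 0.01

predictions = (features @ W).argmax(axis=1)
accuracy = (predictions == labels).mean()
print(f"accuracy before fine-tuning: {accuracy:.2%}")  # near chance, ~10%
```

With 10 classes, random output weights give roughly 10% accuracy; a sustained 0% suggests the output layer or loss is misconfigured.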
I found my own mistake. I was using the darch library and followed their example, where the sigmoid function is used as the activation function during backpropagation. In my case, the output layer needed softmax instead.
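To illustrate the difference (a generic NumPy sketch, not the darch API): sigmoid treats each output unit independently, so the outputs need not sum to 1, while softmax produces a proper probability distribution over the classes, which is what a multi-class classifier and its cross-entropy loss expect:

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic function: each unit is squashed independently."""
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    """Normalized exponentials: outputs form a probability distribution."""
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
print(sigmoid(logits))  # independent activations; their sum can exceed 1
print(softmax(logits))  # non-negative values that sum exactly to 1
```

With a sigmoid output layer, the gradients of a multi-class loss can be badly behaved, which matches the large initial error described in the question.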