Machine learning
Pantuchi, 2020-10-22 16:23:30

Neural network training: why can the recognition rate drop?

Greetings to everyone who stopped by.
I recently wrote my first modest neural network for handwritten digit recognition. I ran it on MNIST and got 86% of digits recognized. Then I decided to write out some digits myself and train the network further on my own examples. The network trained and recognized them. Then I checked the recognition rate on the MNIST database again, and now it is 74%!?!?!!? I don't understand why the rate fell.
My network is input[28*28 + 1 (bias)], hidden[30 + 1 (bias)], output[10]. The activation function is sigmoid. Initial weights are drawn from -0.5 to 0.5, with 0 excluded at the weight-initialization stage.
I convert my images to match MNIST almost 1:1 (digit size and padding, plus slight centering of the digit).
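The MNIST-style conversion described above (digit padded into a 28x28 frame, then nudged toward the center) can be sketched roughly like this. This is an illustrative reconstruction, not the asker's actual code; the function name and the use of `np.roll` are assumptions (a real implementation would shift without wrap-around):

```python
import numpy as np

def center_by_mass(img):
    """Shift a 28x28 grayscale digit so its center of mass sits at
    the image center (13.5, 13.5), roughly as MNIST preprocessing does.
    Note: np.roll wraps pixels around the edge; fine for a sketch."""
    total = img.sum()
    if total == 0:
        return img
    rows, cols = np.indices(img.shape)
    cy = (rows * img).sum() / total
    cx = (cols * img).sum() / total
    dy = int(round(13.5 - cy))
    dx = int(round(13.5 - cx))
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

# A 20x20 digit padded to 28x28 with a 4-pixel border, then centered:
digit = np.ones((20, 20))   # stand-in for a cropped digit
canvas = np.pad(digit, 4)   # zero padding to 28x28, MNIST-style
canvas = center_by_mass(canvas)
print(canvas.shape)  # (28, 28)
```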
P.S.: Backpropagation is implemented; the weights do get corrected.
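For reference, the architecture described in the question (784 inputs + bias, 30 hidden units + bias, 10 outputs, sigmoid activations, weights uniform in [-0.5, 0.5] with exact zeros excluded) can be sketched like this. This is a minimal reconstruction for illustration only, not the asker's code:

```python
import numpy as np

def init_weights(n_in, n_out, rng):
    """Uniform weights in [-0.5, 0.5]; resample any exact zeros,
    as described in the question."""
    w = rng.uniform(-0.5, 0.5, size=(n_out, n_in))
    while np.any(w == 0.0):
        w[w == 0.0] = rng.uniform(-0.5, 0.5, size=np.count_nonzero(w == 0.0))
    return w

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, w_hidden, w_out):
    """x: flattened 28*28 image; a constant 1.0 is appended as bias."""
    x = np.append(x, 1.0)      # 784 inputs + bias -> 785
    h = sigmoid(w_hidden @ x)  # 30 hidden units
    h = np.append(h, 1.0)      # hidden + bias -> 31
    return sigmoid(w_out @ h)  # 10 outputs, one per digit

rng = np.random.default_rng(0)
w_hidden = init_weights(28 * 28 + 1, 30, rng)
w_out = init_weights(30 + 1, 10, rng)
out = forward(np.zeros(28 * 28), w_hidden, w_out)
print(out.shape)  # (10,)
```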


1 answer(s)
xmoonlight, 2020-10-22
@xmoonlight

Lack of signal balancing.
You are tracking only one metric ("recognized", i.e. the forward pass) and forgetting two others: the reverse pass (output -> input) and the error.
In general, it should be solved so that "forward" + "reverse" + "error" => 0.
