Why are RBMs not popular?
I got the impression that at conferences the lion's share of talks are about CNNs, RNNs, and autoencoders, and nobody says anything new about RBMs. What happened?
I found the answer myself: the main use case of RBMs was initializing network weights via unsupervised pre-training, in order to work around the vanishing-gradient problem. Since then, more efficient solutions to that problem have been found, in particular Glorot (Xavier) initialization and the ReLU activation.
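To make the two replacements concrete, here is a minimal NumPy sketch of Glorot (Xavier) uniform initialization plus a ReLU forward pass; the layer sizes and function names are illustrative, not from any particular framework:

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng):
    # Glorot/Xavier uniform: sample from U(-limit, limit)
    # with limit = sqrt(6 / (fan_in + fan_out)), which keeps
    # activation variance roughly constant across layers.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def relu(x):
    # ReLU avoids the saturation of sigmoid/tanh units,
    # so gradients do not shrink toward zero in deep nets.
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
W1 = glorot_uniform(784, 256, rng)   # input -> hidden
W2 = glorot_uniform(256, 10, rng)    # hidden -> output

x = rng.standard_normal((1, 784))    # one dummy input vector
h = relu(x @ W1)                     # hidden activations, all >= 0
out = h @ W2                         # raw output scores, shape (1, 10)
```

With this combination, a deep network can be trained end-to-end from random weights, so the layer-by-layer RBM pre-training step is no longer needed.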