Python
AlexBoss, 2019-03-05 21:24:57

Why multiple layers in tensorflow and how do they work?

Good afternoon. When solving the same problem, I encounter the following variations:
1) model.add(keras.layers.Dense(800, activation='relu'))
2) model.add(keras.layers.Dense(500, activation='relu'))
   model.add(keras.layers.Dense(150, activation='relu'))
What is the point of the second layer in this case? If it's not too much trouble, please explain how it works.
I understand that every neuron of the first layer is connected to every neuron of the second layer and that this improves the quality of the network, but wouldn't it be simpler in that case to just make one large layer?
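For context, here is a minimal sketch of the two variants as complete models. The 800 and 500/150 layer sizes come from the question; the input shape, output layer, and compile settings are assumptions (a 784-feature input and a 10-class softmax output, as for MNIST):

from tensorflow import keras

# Variant 1: a single wide hidden layer (size taken from the question)
model1 = keras.Sequential([
    keras.layers.Input(shape=(784,)),              # assumed input size
    keras.layers.Dense(800, activation='relu'),
    keras.layers.Dense(10, activation='softmax'),  # assumed output layer
])

# Variant 2: two narrower hidden layers stacked one after the other;
# the 150-unit layer takes the 500-unit layer's output as its input
model2 = keras.Sequential([
    keras.layers.Input(shape=(784,)),
    keras.layers.Dense(500, activation='relu'),
    keras.layers.Dense(150, activation='relu'),
    keras.layers.Dense(10, activation='softmax'),
])

for m in (model1, model2):
    m.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
    m.summary()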


3 answers
origami1024, 2019-03-05
@AlexBoss

This is hard to explain in simple terms, and it isn't a rigorously proven question. In theory, any function can be represented and trained on a single layer if it has enough neurons. But empirically, a more complex function is easier to build and train with 6 neurons arranged in two layers of 3 than with the same 6 neurons in one layer.
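As a rough illustration of that last point, here is a sketch that trains both arrangements (one layer of 6 neurons vs two layers of 3) on a toy non-linear dataset; the dataset, optimizer and epoch count are assumptions, not part of the original answer:

from sklearn.datasets import make_moons
from tensorflow import keras

# Toy non-linearly-separable data (two interleaving half-moons)
X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)

def build(hidden_sizes):
    model = keras.Sequential([keras.layers.Input(shape=(2,))])
    for n in hidden_sizes:
        model.add(keras.layers.Dense(n, activation='relu'))
    model.add(keras.layers.Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model

# 6 neurons in one layer vs the same 6 neurons split into two layers of 3
for sizes in ([6], [3, 3]):
    model = build(sizes)
    model.fit(X, y, epochs=100, verbose=0)
    _, acc = model.evaluate(X, y, verbose=0)
    print(sizes, 'train accuracy:', round(acc, 3))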

TriKrista, 2019-03-05
@TriKrista

IMHO, to put it very simply: in general, each layer can be thought of as adding one decision curve on the graph. For example, to separate pears from apples one curve is enough, so one layer suffices; but to solve the XOR problem two curves are needed, so two layers in total.
Also, in general, a multi-layer network needs fewer synapses (weights) to solve the same problem than a single-layer one.
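A small sketch of the XOR point, under assumed layer width and training settings: with no hidden layer the model has only a single linear boundary and cannot classify all four XOR points correctly, while one hidden layer (the second "curve") can separate them.

import numpy as np
from tensorflow import keras

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype='float32')
y = np.array([0, 1, 1, 0], dtype='float32')   # XOR truth table

def build(with_hidden_layer):
    model = keras.Sequential([keras.layers.Input(shape=(2,))])
    if with_hidden_layer:
        # the extra layer adds the second decision "curve"
        model.add(keras.layers.Dense(8, activation='relu'))
    model.add(keras.layers.Dense(1, activation='sigmoid'))
    model.compile(optimizer=keras.optimizers.Adam(0.05),
                  loss='binary_crossentropy', metrics=['accuracy'])
    return model

for with_hidden_layer in (False, True):
    model = build(with_hidden_layer)
    model.fit(X, y, epochs=500, verbose=0)
    _, acc = model.evaluate(X, y, verbose=0)
    print('hidden layer:', with_hidden_layer, '-> accuracy:', acc)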

ivodopyanov, 2019-03-06
@ivodopyanov

https://playground.tensorflow.org/
Here you can play around with the number of layers and neurons in a trivial network and see how these parameters affect the expressiveness of the model.
