Why can't a multilayer perceptron be used on small data samples?
Hello!
For my thesis I was assigned to build a multilayer perceptron on a small sample. The data consist of three monthly time series: income (D), expenditure (R), and subsidy (S). There are 15 months of income data and 17 months of expenditure and subsidy data.
I want to build the following regression: predict income in months 16 and 17 from the previous month's income lag_1(D), R, and S.
I have read Haykin and articles online. Everyone stubbornly insists that a large sample is needed to fit the network weights.
Question: why exactly? Why can't I take a multilayer perceptron, make 3 inputs, 2 neurons in a hidden layer and 1 output, and use a sigmoid as the activation function?
I would experiment with the learning rate and momentum of gradient descent, train the model on the first 13 months, updating the weights on each iteration, and test the network on months 14 and 15.
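Roughly, in code I imagine something like this (a sketch using scikit-learn's MLPRegressor; the numbers are just made-up stand-ins for my real series, already on a unit scale since real data would need normalizing first):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Stand-in monthly series (the real D, R, S would be loaded from my data)
months = 15
D_lag1 = np.linspace(1.00, 1.30, months)   # previous month's income
R = np.linspace(0.80, 0.95, months)        # expenditure
S = np.linspace(0.10, 0.20, months)        # subsidy
D = 1.02 * D_lag1 + 0.1 * S - 0.05 * R     # income to predict

X = np.column_stack([D_lag1, R, S])        # 3 inputs per month

net = MLPRegressor(hidden_layer_sizes=(2,),   # one hidden layer, 2 neurons
                   activation='logistic',     # sigmoid in the hidden layer
                   solver='sgd',
                   learning_rate_init=0.01,   # learning rate to experiment with
                   momentum=0.9,              # momentum to experiment with
                   max_iter=5000,
                   random_state=0)
net.fit(X[:13], D[:13])                    # train on the first 13 months
print(net.predict(X[13:15]))               # test on months 14 and 15
```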
Thank you!
In real life this would be inappropriate, but what is good enough for a thesis is for you to judge.
For different models there are different heuristics for how many degrees of freedom it is reasonable to use (for example, you can come across the recommendation not to use more parameters than the number of samples divided by 10). On the other hand, there is a view that proper regularization lets you relax these requirements - see, for example, https://jakevdp.github.io/blog/2015/07/06/model-co... (a small sketch of that idea is below).
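As an illustration only, here is a minimal sketch of regularization on a tiny synthetic dataset, assuming scikit-learn's MLPRegressor (its alpha parameter is an L2 penalty on the weights):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for 13 training months with 3 features each
rng = np.random.default_rng(0)
X = rng.normal(size=(13, 3))
y = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.05, size=13)

# Same architecture, weak vs. strong L2 penalty on the weights
weak = MLPRegressor(hidden_layer_sizes=(2,), alpha=1e-6,
                    solver='lbfgs', max_iter=5000, random_state=0).fit(X, y)
strong = MLPRegressor(hidden_layer_sizes=(2,), alpha=1.0,
                      solver='lbfgs', max_iter=5000, random_state=0).fit(X, y)

print("weak L2  :", weak.predict(X[:3]))
print("strong L2:", strong.predict(X[:3]))
```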
P.S. Why a sigmoid? At a minimum, a different activation, something like ReLU, will be needed at the output.
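A tiny illustration of why: a sigmoid output can only produce values in (0, 1), so it cannot reproduce raw income figures unless the target is rescaled first (or the output is kept linear):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

incomes = np.array([120.0, 135.0, 150.0])  # made-up monthly income values
print(sigmoid(incomes))                    # all ~1.0: the target range is unreachable
```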
Yes, you can do all of that; the question is whether such a model will suit you.
The more data, the less overfitting; the data gets, so to speak, smeared across the network.
With little data the network will certainly learn something, but on cases it has not seen it will give results that are completely unusable for you.
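For example, a quick synthetic sketch of that effect (assuming scikit-learn): an over-flexible model fits the training months almost perfectly while failing on the held-out ones.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# 15 synthetic "months": 13 for training, 2 held out
rng = np.random.default_rng(1)
X = rng.normal(size=(15, 3))
y = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.1, size=15)

X_train, y_train = X[:13], y[:13]
X_test, y_test = X[13:], y[13:]

# Deliberately flexible network with almost no regularization
net = MLPRegressor(hidden_layer_sizes=(50,), alpha=1e-6,
                   solver='lbfgs', max_iter=20000, random_state=0)
net.fit(X_train, y_train)

print("train MSE:", mean_squared_error(y_train, net.predict(X_train)))
print("test  MSE:", mean_squared_error(y_test, net.predict(X_test)))
# Typically the training error ends up far below the test error here.
```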