Neural networks
Xvir43, 2019-11-12 15:28:42

Need help with L-2 regularization?

I have a neural network trained with backpropagation and gradient descent. At what point is L2 regularization applied? I understand that it comes in during the backward pass (error backpropagation), but where exactly? When the weights are updated?



1 answer
freeExec, 2019-11-12
@Xvir43

Roughly speaking, the regularization terms are added to the gradient at the moment the weights are updated:

newWeight = weight - learningRate * (grad / batchSize + L2 * weight + L1 * sign(weight))
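
A minimal Python sketch of that idea (the function name sgd_step_with_l2 and the hyperparameter names lr, l2 and batch_size are placeholders of mine, not from the answer): backpropagation computes the data gradient as usual, and the L2 term is simply added to that gradient at the weight-update step.

import numpy as np

def sgd_step_with_l2(weight, grad, lr=0.01, l2=1e-4, batch_size=32):
    # grad is dLoss/dWeight summed over the batch, exactly as backprop produced it.
    # The L2 penalty 0.5 * l2 * sum(weight**2) contributes l2 * weight to the gradient,
    # so regularization only changes the update, not the backward pass itself.
    total_grad = grad / batch_size + l2 * weight
    return weight - lr * total_grad

# Usage with placeholder values
w = np.random.randn(4, 3)
g = np.random.randn(4, 3)   # pretend this came from backpropagation
w = sgd_step_with_l2(w, g)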
