Dropout Regularization for Neural Networks - Deep Learning Dictionary
Generally, regularization is any technique that modifies the model, or the learning algorithm more broadly, in an attempt to improve its ability to generalize to unseen data, possibly at the expense of a higher training loss.
Dropout is a popular regularization technique that randomly ignores, or drops out, nodes and their corresponding weights in specified layers of a network. A fresh random subset of nodes is dropped on each training iteration, while at inference time all nodes are kept active.
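As a concrete illustration, here is a minimal NumPy sketch of the common "inverted dropout" variant, where surviving activations are rescaled during training so that no adjustment is needed at inference time. The function name, signature, and drop rate are illustrative, not part of the original text:

```python
import numpy as np

def dropout(activations, rate=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, zero each activation with probability `rate` and
    scale the survivors by 1 / (1 - rate) so the expected activation
    is unchanged. At inference time, return the input untouched.
    """
    if not training or rate == 0.0:
        return activations
    rng = rng or np.random.default_rng()
    # Boolean mask: True means the node is kept this iteration.
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

# Example: with rate=0.5, kept activations of 1.0 become 2.0,
# dropped ones become 0.0; inference leaves values unchanged.
a = np.ones((4, 5))
train_out = dropout(a, rate=0.5, training=True,
                    rng=np.random.default_rng(0))
infer_out = dropout(a, rate=0.5, training=False)
```

In practice you would use a framework's built-in layer (e.g. a dropout layer in Keras or PyTorch) rather than hand-rolling the mask, but the mechanism is the same: a new random mask per training iteration, and no masking at inference.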