Batch Normalization for Neural Networks - Deep Learning Dictionary
In non-normalized datasets, features with relatively large values can cause instability in neural networks. These large inputs cascade through the network's layers, which may produce imbalanced gradients during training, potentially leading to the exploding gradient problem.
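To make the effect concrete, here is a minimal NumPy sketch of the standard batch normalization transform: each feature is normalized over the batch to zero mean and unit variance, then rescaled and shifted by learnable parameters. The function name `batch_norm` and the identity choices for `gamma` and `beta` are illustrative, not from the original text.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature of a mini-batch, then scale and shift.

    x: array of shape (batch_size, num_features)
    gamma, beta: learnable per-feature scale and shift
    eps: small constant for numerical stability
    """
    mean = x.mean(axis=0)                  # per-feature mean over the batch
    var = x.var(axis=0)                    # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta            # restore representational capacity

# Example: two features on wildly different scales, as described above
rng = np.random.default_rng(0)
x = rng.normal(loc=[0.0, 1000.0], scale=[1.0, 500.0], size=(64, 2))

out = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
print(out.mean(axis=0))  # each feature now has mean ~0
print(out.std(axis=0))   # and standard deviation ~1
```

After the transform, both features contribute on a comparable scale, so no single large-valued feature dominates the activations flowing into the next layer.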