Machine Learning & Deep Learning Fundamentals

with deeplizard.

Vanishing & Exploding Gradient explained | A problem resulting from backpropagation

March 23, 2018

Let's discuss a problem that crops up time and time again during the training process of an artificial neural network. This is the problem of unstable gradients, and it is most popularly referred to as the vanishing gradient problem.

We're first going to answer the question: what is the vanishing gradient problem, anyway? Here, we'll cover the idea conceptually. We'll then move our discussion to how this problem occurs. Then, with the understanding we'll have developed up to that point, we'll discuss the problem of exploding gradients, which we'll see is actually very similar to the vanishing gradient problem, and so we'll be able to take what we learned about that problem and apply it to this new one.
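Before diving in, here is a minimal numeric sketch (not from the original post) of the intuition behind both problems: during backpropagation, the chain rule multiplies one derivative factor per layer, so a factor below 1 shrinks the gradient toward zero while a factor above 1 blows it up. The layer count and the per-layer factors below are illustrative assumptions, not values from any particular network.

```python
def gradient_magnitude(num_layers, per_layer_factor):
    """Compound a single per-layer derivative factor across layers,
    the way the chain rule does during backpropagation."""
    grad = 1.0
    for _ in range(num_layers):
        grad *= per_layer_factor
    return grad

# The sigmoid's derivative is at most 0.25, so with 30 layers the
# product shrinks toward zero: the gradient "vanishes".
vanishing = gradient_magnitude(30, 0.25)

# A per-layer factor above 1 compounds the other way and the
# gradient "explodes".
exploding = gradient_magnitude(30, 1.5)

print(f"vanishing: {vanishing:.3e}")  # a tiny number near zero
print(f"exploding: {exploding:.3e}")  # a very large number
```

The takeaway from the sketch: neither value is a usable weight update, and the deeper the network, the more extreme the compounding becomes.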