Machine Learning & Deep Learning Fundamentals

with deeplizard.

Backpropagation explained | Part 5 - What puts the "back" in backprop?

March 14, 2018

Description

Let's see the math that explains how backpropagation works backwards through a neural network. In the previous video, we saw how to calculate the gradient of the loss function using backpropagation. What we haven't yet seen, though, is where the backwards movement comes into play that we talked about when we discussed the intuition for backprop. So now, we're going to build on the knowledge we've already developed to understand exactly what puts the "back" in backpropagation. The explanation we'll give will be math-based, so we'll first explore the motivation needed to understand the calculations we'll be working through. We'll then jump right into the calculations, which, as we'll see, are actually quite similar to ones we worked through in the previous video. Once we've got the math down, we'll bring everything together to reach the mind-blowing realization of how these calculations are mathematically done in a backwards fashion.
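For a preview of the recursion the video builds toward, here is a minimal sketch in standard backpropagation notation (the video's own notation may differ). For a network with weights $w^{(l)}$, pre-activations $z^{(l)} = w^{(l)} a^{(l-1)} + b^{(l)}$, activations $a^{(l)} = g(z^{(l)})$, and loss $C$, the layer-wise error terms satisfy

$$
\delta^{(L)} = \nabla_{a^{(L)}} C \odot g'\!\left(z^{(L)}\right), \qquad
\delta^{(l)} = \left(\left(w^{(l+1)}\right)^{\top} \delta^{(l+1)}\right) \odot g'\!\left(z^{(l)}\right), \qquad
\frac{\partial C}{\partial w^{(l)}} = \delta^{(l)} \left(a^{(l-1)}\right)^{\top}.
$$

Because each $\delta^{(l)}$ is built from $\delta^{(l+1)}$, the gradients must be computed starting at the output layer and moving toward the input layer; that dependency is what puts the "back" in backpropagation.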