# Machine Learning & Deep Learning Fundamentals

with deeplizard.

## Backpropagation explained | Part 4 - Calculating the gradient

March 6, 2018

### Description

We’re now on part 4 of our journey through understanding backpropagation. In the last video, we focused on how to mathematically express certain facts about the training process. Now we’re going to use those expressions to differentiate the loss of the neural network with respect to the weights.
Recall from the video covering the intuition for backpropagation that, for stochastic gradient descent to update the weights of the network, it first needs to calculate the gradient of the loss with respect to those weights. Calculating this gradient is exactly what we’ll focus on in this video.
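The update that stochastic gradient descent performs once it has the gradient can be sketched in a few lines. This is a minimal illustration, not code from the video; the weight values and learning rate are made up:

```python
import numpy as np

def sgd_step(weights, grad, lr=0.01):
    # Stochastic gradient descent update: nudge each weight a small
    # step opposite the gradient, i.e. in the direction that
    # decreases the loss.
    return weights - lr * grad

# Illustrative numbers: two weights and their loss gradients.
w = np.array([0.5, -0.3])
g = np.array([0.2, -0.1])
w_new = sgd_step(w, g, lr=0.1)  # ≈ [0.48, -0.29]
```

The entire point of backpropagation is to supply the `grad` argument used here.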
We’ll start by looking at the equation that backprop uses to differentiate the loss with respect to the weights in the network. We’ll see that this equation is made up of multiple terms, so next we’ll break it down and focus on each term individually. Lastly, we’ll combine the results from each term to obtain the final result: the gradient of the loss function.
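As a rough sketch of the chain-rule product described above, here is the derivative of the loss with respect to a single weight feeding an output node. The activation function (sigmoid), the loss (squared error), and all numbers are illustrative assumptions, not taken from the video:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Hypothetical values for one weight w connecting a previous-layer
# activation a_prev to an output node with bias b and target y.
a_prev = 0.6
w, b = 0.8, 0.1
y = 1.0

z = w * a_prev + b        # weighted input to the node
a = sigmoid(z)            # node activation

# Chain rule: dC/dw = (dC/da) * (da/dz) * (dz/dw)
dC_da = a - y             # from loss C = 0.5 * (a - y)**2
da_dz = sigmoid_prime(z)  # derivative of the activation
dz_dw = a_prev            # since z = w * a_prev + b
grad_w = dC_da * da_dz * dz_dw
```

Each factor in the product corresponds to one of the terms the video breaks down; multiplying them gives the single entry of the gradient for this weight.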
💥🦎 DEEPLIZARD COMMUNITY RESOURCES 🦎💥
👉 Check out the blog post and other resources for this video:
🔗 https://deeplizard.com/learn/video/Zr5viAZGndE
💻 DOWNLOAD ACCESS TO CODE FILES
🤖 Available for members of the deeplizard hivemind:
🔗 https://www.patreon.com/posts/27743395
🧠 Support collective intelligence, join the deeplizard hivemind:
🔗 https://deeplizard.com/hivemind
❤️🦎 Special thanks to the following polymaths of the deeplizard hivemind:
Peder B. Helland
👀 Follow deeplizard:
Twitter: https://twitter.com/deeplizard
Facebook: https://www.facebook.com/Deeplizard-145413762948316
Patreon: https://www.patreon.com/deeplizard
YouTube: https://www.youtube.com/deeplizard
Instagram: https://www.instagram.com/deeplizard/
🎓 Other deeplizard courses:
Reinforcement Learning - https://deeplizard.com/learn/playlist/PLZbbT5o_s2xoWNVdDudn51XM8lOuZ_Njv
NN Programming - https://deeplizard.com/learn/playlist/PLZbbT5o_s2xrfNyHZsM6ufI0iZENK9xgG
DL Fundamentals - https://deeplizard.com/learn/playlist/PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU
Keras - https://deeplizard.com/learn/playlist/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL
TensorFlow.js - https://deeplizard.com/learn/playlist/PLZbbT5o_s2xr83l8w44N_g3pygvajLrJ-
Data Science - https://deeplizard.com/learn/playlist/PLZbbT5o_s2xrth-Cqs_R9-
Trading - https://deeplizard.com/learn/playlist/PLZbbT5o_s2xr17PqeytCKiCD-TJj89rII
🎵 deeplizard uses music by Kevin MacLeod
🔗 https://www.youtube.com/channel/UCSZXFhRIx6b0dFX3xS8L1yQ
🔗 http://incompetech.com/
❤️ Please use the knowledge gained from deeplizard content for good, not evil.