Machine Learning & Deep Learning Fundamentals

with deeplizard.

Weight Initialization explained | A way to reduce the vanishing gradient problem

March 30, 2018


Let's talk about how the weights in an artificial neural network are initialized, how this initialization affects the training process, and what YOU can do about it!

To kick off our discussion on weight initialization, we're first going to discuss how these weights are initialized and how the initialized values might negatively affect the training process. We'll see that these randomly initialized weights actually contribute to the vanishing and exploding gradient problem we covered in the last video. With this in mind, we'll then explore what we can do to influence how this initialization occurs. We'll see how Xavier initialization (also called Glorot initialization) can help combat this problem. Then, we'll see how we can specify the way the weights for a given model are initialized in code, using the kernel_initializer parameter for a given layer in Keras.

Reference to the original paper by Xavier Glorot and Yoshua Bengio:
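To make the idea concrete, here is a minimal NumPy sketch of the Glorot (Xavier) uniform scheme: weights are drawn from a uniform distribution whose limit depends on the layer's fan-in and fan-out, which keeps the variance of activations and gradients roughly constant from layer to layer. The function name and the use of NumPy here are illustrative choices, not the Keras implementation itself.

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix from U(-limit, limit),
    where limit = sqrt(6 / (fan_in + fan_out)).

    Scaling the range by both fan_in and fan_out keeps the variance of
    the forward activations and the backward gradients roughly constant
    across layers, which helps reduce vanishing/exploding gradients.
    """
    if rng is None:
        rng = np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Example: initialize the weights for a 784 -> 256 dense layer.
W = glorot_uniform(784, 256)
```

In Keras itself you don't need to write this by hand: passing `kernel_initializer='glorot_uniform'` to a layer such as `Dense` selects this scheme (it is also the default for many Keras layers).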