Deep Learning Course
Deep Learning Fundamentals - Classic Edition
Beginner Friendly · Intuitive Explanations · Mathematically Focused · Theory Based
Level: Beginner
Instructor: Mandy
Open Course

What you'll learn:
Learn the relationship between machine learning and deep learning
Understand artificial neural networks and all their components
Gain an intuitive understanding of neural network training
Learn how neural networks are implemented in code (a minimal sketch follows this list)
Understand fundamental network training concepts like learning rates and loss functions
Learn about problems that can arise during training and potential solutions
Gain an understanding of how to process data for neural networks
Learn about the different categories of datasets in deep learning: training, validation, test
Understand the different categories of learning: supervised, unsupervised, semi-supervised
Gain an understanding of all the components in a convolutional neural network (CNN)
Understand how CNNs detect patterns in image data
Learn how zero padding and max pooling operations affect data in convolutional neural networks
Learn the mathematics behind backpropagation and how it's used during training
Gain an understanding of the learnable parameters present in a neural network
Learn how network training is improved by regularization and batch normalization
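The lessons themselves stay at the conceptual level and are not tied to any particular framework. Purely as an illustration of a few of the terms in this list (layers, an activation function, a loss function, a learning rate, and a backpropagation-style update), here is a minimal NumPy sketch of one training step. It is not course code; the layer sizes, toy data, and learning rate are assumptions chosen for the example.

```python
# Minimal, illustrative sketch only (not course material): one gradient-descent
# step for a tiny fully connected network, using NumPy. Layer sizes, toy data,
# and the 0.1 learning rate are all assumptions made for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 input features, binary labels.
X = rng.normal(size=(4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Learnable parameters (weights and biases) for two dense layers.
W1, b1 = rng.normal(size=(3, 5)) * 0.1, np.zeros(5)
W2, b2 = rng.normal(size=(5, 1)) * 0.1, np.zeros(1)

learning_rate = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: hidden layer with ReLU activation, sigmoid output layer.
h_pre = X @ W1 + b1
h = np.maximum(0.0, h_pre)        # ReLU activation function
y_hat = sigmoid(h @ W2 + b2)      # predicted probabilities

# Binary cross-entropy loss (one example of a loss function).
loss = -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
print(f"loss before update: {loss:.4f}")

# Backpropagation: gradients of the loss w.r.t. each learnable parameter.
d_out = (y_hat - y) / len(X)            # dLoss/d(pre-sigmoid output)
dW2 = h.T @ d_out
db2 = d_out.sum(axis=0)
d_h = (d_out @ W2.T) * (h_pre > 0)      # gradient flowing back through ReLU
dW1 = X.T @ d_h
db1 = d_h.sum(axis=0)

# Gradient-descent update, scaled by the learning rate.
W1 -= learning_rate * dW1
b1 -= learning_rate * db1
W2 -= learning_rate * dW2
b2 -= learning_rate * db2
```

In practice a framework handles these gradient computations automatically; the explicit version above only mirrors the concepts named in the outcomes.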
Part 1 - INTRO TO DEEP LEARNING
Section 1 - Artificial Neural Network Basics
Lesson #1 - Deep Learning playlist overview & Machine Learning intro
Lesson #2 - Deep Learning explained
Lesson #3 - Artificial Neural Networks explained
Lesson #4 - Layers in a Neural Network explained
Lesson #5 - Activation Functions in a Neural Network explained
Lesson #6 - Training a Neural Network explained
Lesson #7 - How a Neural Network Learns explained
Lesson #8 - Loss in a Neural Network explained
Lesson #9 - Learning Rate in a Neural Network explained
Section 2 - Data Topics for Deep Learning
Lesson #10 - Train, Test, & Validation Sets explained
Lesson #11 - Predicting with a Neural Network explained
Lesson #12 - Overfitting in a Neural Network explained
Lesson #13 - Underfitting in a Neural Network explained
Lesson #14 - Supervised Learning explained
Lesson #15 - Unsupervised Learning explained
Lesson #16 - Semi-supervised Learning explained
Lesson #17 - Data Augmentation explained
Lesson #18 - One-hot Encoding explained
Part 2 - DEEP LEARNING CONCEPTS
Section 1 - Convolutional Neural Networks (CNNs)
Lesson #19 - Convolutional Neural Networks (CNNs) explained
Lesson #20 - Visualizing Convolutional Filters from a CNN
Lesson #21 - Zero Padding in Convolutional Neural Networks explained
Lesson #22 - Max Pooling in Convolutional Neural Networks explained
Section 2 - Backpropagation
Lesson #23 - Backpropagation explained | Part 1 - The intuition
Lesson #24 - Backpropagation explained | Part 2 - The mathematical notation
Lesson #25 - Backpropagation explained | Part 3 - Mathematical observations
Lesson #26 - Backpropagation explained | Part 4 - Calculating the gradient
Lesson #27 - Backpropagation explained | Part 5 - What puts the "back" in backprop?
Section 3 - Additional Deep Learning Concepts
Lesson #28 - Vanishing & Exploding Gradient explained | A problem resulting from backpropagation
Lesson #29 - Weight Initialization explained | A way to reduce the vanishing gradient problem
Lesson #30 - Bias in an Artificial Neural Network explained | How bias impacts training
Lesson #31 - Learnable Parameters in an Artificial Neural Network explained
Lesson #32 - Learnable Parameters in a Convolutional Neural Network (CNN) explained
Lesson #33 - Regularization in a Neural Network explained
Lesson #34 - Batch Size in a Neural Network explained
Lesson #35 - Fine-tuning a Neural Network explained
Lesson #36 - Batch Normalization explained