Deep Learning Course - Level: Beginner
We've now seen the various gradient descent-based optimizers, which differ in how often they perform weight updates during training, as well as in how much of the data set is considered in each weight update.
There are additional optimizers that are also based on gradient descent but include further improvements, such as momentum terms and adaptive learning rates, that allow the model to converge more quickly and accurately on a loss minimum.
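As a rough illustration of one such improvement, the sketch below (not taken from the course) compares plain gradient descent with a momentum variant on a simple hand-written quadratic loss. The loss, learning rate, and momentum coefficient are all assumptions chosen for demonstration; the point is only that the velocity term can reduce the number of weight updates needed on an ill-conditioned loss surface.

```python
import numpy as np

# Illustrative quadratic loss with very different curvature in each
# direction: f(w) = 0.5 * (w[0]**2 + 100 * w[1]**2).
# Its gradient is (w[0], 100 * w[1]).
def gradient(w):
    return np.array([w[0], 100.0 * w[1]])

def steps_to_converge(use_momentum, lr=0.009, beta=0.9,
                      tol=1e-3, max_steps=5000):
    """Count weight updates until the weights are within tol of the minimum."""
    w = np.array([1.0, 1.0])
    v = np.zeros(2)
    for step in range(1, max_steps + 1):
        g = gradient(w)
        if use_momentum:
            v = beta * v + g   # accumulate an exponentially decaying velocity
            w = w - lr * v
        else:
            w = w - lr * g     # plain gradient descent update
        if np.linalg.norm(w) < tol:
            return step
    return max_steps

plain = steps_to_converge(use_momentum=False)
mom = steps_to_converge(use_momentum=True)
print(f"plain GD: {plain} updates, momentum: {mom} updates")
```

On this surface the small learning rate (forced by the steep direction) makes plain gradient descent crawl along the shallow direction, while momentum accumulates speed there and converges in far fewer updates.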