From the course: Deep Learning with Python: Optimizing Deep Learning Models
Batch normalization - Python Tutorial
- [Instructor] In deep learning, as model parameters are updated during training, the distribution of input values to each layer can change as the model learns. This change, known as internal covariate shift, can slow down the learning process and make training more challenging. Batch normalization addresses this by normalizing the inputs to each layer so that they have a consistent scale and distribution during training. Batch normalization operates in three main steps. First, it calculates the mean and variance of each feature in the mini-batch. This gives a snapshot of how the inputs are distributed for that batch. Next, it normalizes the inputs to have zero mean and a standard deviation of one. This ensures that the inputs to the layer are standardized, making the model easier to train. Given a mini-batch of inputs B, the normalization approach is represented mathematically as shown here, where Xi hat is the new standardized input, Xi is the original input, B bar is the mean of the mini-batch…
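The steps described above can be sketched as a minimal NumPy forward pass. This is an illustration under stated assumptions, not the course's own code: `gamma`, `beta`, and `eps` are assumed names for the learnable scale, learnable shift, and the small constant added for numerical stability; the third step (scaling and shifting the standardized values) is the standard completion of the algorithm.

```python
import numpy as np

def batch_norm(X, gamma=1.0, beta=0.0, eps=1e-5):
    """Minimal batch-normalization forward pass for a mini-batch X
    of shape (batch_size, num_features)."""
    # Step 1: per-feature mean and variance over the mini-batch
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    # Step 2: standardize to zero mean, unit standard deviation
    X_hat = (X - mu) / np.sqrt(var + eps)
    # Step 3: scale and shift with the learnable parameters gamma and beta
    return gamma * X_hat + beta

# Example: a mini-batch of three samples with two features each
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
out = batch_norm(X)
print(out.mean(axis=0))  # approximately zero for each feature
print(out.std(axis=0))   # approximately one for each feature
```

In a real network, `gamma` and `beta` would be per-feature vectors updated by gradient descent, which lets the layer recover the original scale and shift if that turns out to be optimal.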
Contents
- Batch normalization (3m 5s)
- Applying batch normalization to a deep learning model (2m 55s)
- Gradient clipping (5m 10s)
- Applying gradient clipping to a deep learning model (3m)
- Early stopping and checkpointing (3m 23s)
- Learning rate scheduling (4m 56s)
- Training a deep learning model using callbacks (6m 13s)