From the course: Deep Learning with Python: Optimizing Deep Learning Models
Applying gradient clipping to a deep learning model - Python Tutorial
- In this video, you will learn how to apply gradient clipping to a deep learning model in Python. I'll be writing the code in the 05_04 E file. You can follow along by completing the empty code cells in the 05_04 B file. Note that this is the second in a three-video sequence that teaches you how to apply batch normalization, gradient clipping, early stopping, and learning-rate scheduling to a deep learning model. If you have not done so, watch the video on how to apply batch normalization to a deep learning model for a detailed explanation of the prior code. Before we begin, let's run the code we created in that video to get our environment up to speed. So, the first thing I need to do is specify my kernel, get my Python environment, 3.10. I'm going to click on my next code cell and say run all above. I'm going to scroll up a little bit to make sure I know when things are done. All right. So now that we've defined our model's architecture, let's compile it by specifying the…
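The transcript cuts off just as the model is being compiled. As a point of reference, the sketch below shows the general pattern for applying gradient clipping in Keras at compile time; the layer sizes, input shape, loss, and clipping threshold are placeholders, not the course's actual values, and the course notebook may structure its model differently.

```python
# Minimal sketch: gradient clipping in Keras via the optimizer.
# Assumed/placeholder: the architecture, input shape, loss, and clipnorm value.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder architecture standing in for the model built in the prior video.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),
])

# Gradient clipping is specified on the optimizer:
# - clipnorm=1.0 rescales each gradient so its L2 norm does not exceed 1.0
# - alternatively, clipvalue=0.5 would clip each gradient element to [-0.5, 0.5]
optimizer = keras.optimizers.Adam(learning_rate=0.001, clipnorm=1.0)

# Compile with the clipping-enabled optimizer; training then proceeds as usual.
model.compile(optimizer=optimizer, loss="mse", metrics=["mae"])
```

Passing `clipnorm` (or `clipvalue`) to the optimizer means every gradient update during `model.fit()` is clipped automatically, with no change needed to the training loop itself.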
Contents
- Batch normalization (3m 5s)
- Applying batch normalization to a deep learning model (2m 55s)
- Gradient clipping (5m 10s)
- Applying gradient clipping to a deep learning model (3m)
- Early stopping and checkpointing (3m 23s)
- Learning rate scheduling (4m 56s)
- Training a deep learning model using callbacks (6m 13s)