From the course: Deep Learning with Python: Optimizing Deep Learning Models


Key hyperparameters in deep learning


- [Narrator] In deep learning, hyperparameters are external settings that define how a neural network is structured and trained. Unlike parameters, which are learned during training, hyperparameters are set before training begins, and they have a significant impact on the model's ability to learn and generalize. Selecting the right hyperparameters is crucial for ensuring efficient training, robust performance, and good generalization to unseen data. One of the most important hyperparameters is the learning rate, which controls how much the model adjusts its weights during each training iteration. A learning rate that is too high can cause the model to overshoot the optimal values, leading to unstable training, while a learning rate that is too low results in slow convergence and may trap the model in a suboptimal solution. It's common to start with a modest learning rate, such as 0.001, especially when using optimizers like Adam. This value serves as a stable baseline that often…
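The effect of the learning rate described above can be illustrated with a minimal sketch (not from the course): a single gradient-descent step on the toy loss f(w) = w², whose minimum is at w = 0. The function name `gradient_step` and the specific rates are illustrative choices, not part of the lesson.

```python
# Minimal sketch: how the learning rate scales each weight update in
# gradient descent, using the toy loss f(w) = w**2 (gradient 2*w, minimum at 0).
def gradient_step(w, lr):
    grad = 2 * w          # df/dw for f(w) = w**2
    return w - lr * grad  # standard gradient-descent update rule

w_stable = gradient_step(1.0, 0.001)   # modest rate: small, stable step
w_overshoot = gradient_step(1.0, 1.5)  # rate too high: jumps past the minimum

print(w_stable)     # 0.998 -> moved slightly toward the minimum at 0
print(w_overshoot)  # -2.0  -> overshot 0 and landed farther away
```

With the modest rate the weight inches toward the optimum, while the large rate overshoots it, which is exactly the unstable behavior the narration warns about; 0.001 is also the conventional default when pairing a learning rate with the Adam optimizer.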
