Overview
Explore methods for optimizing machine learning models in this 31-minute video and learn how to adjust hyperparameters to minimize the cost function. Delve into key concepts including loss functions, gradient descent, learning rates, and the limitations of gradient descent. Discover the advantages of stochastic gradient descent and see them in a practical demonstration. Gain insights into optimization techniques that can improve the performance of your machine learning models.
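As a rough illustration of the ideas the video covers, the sketch below fits a single weight to toy data by minimizing a mean-squared-error loss, first with full-batch gradient descent and then with stochastic gradient descent. The data, loss function, and learning rates are assumptions chosen for the example, not the code used in the video's demo.

```python
# Minimal sketch (illustrative assumptions, not the video's demo code):
# fit one weight w so that w * x approximates y, by minimizing the MSE loss.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=1000)
y = 3.0 * x + rng.normal(scale=0.1, size=1000)   # toy data: y is roughly 3x

def loss(w):
    # Mean squared error of the predictions w * x against the targets y.
    return np.mean((w * x - y) ** 2)

# --- Full-batch gradient descent ---
w = 0.0
learning_rate = 0.1                        # step size: too large diverges, too small crawls
for step in range(100):
    grad = np.mean(2 * (w * x - y) * x)    # gradient of the loss over the whole dataset
    w -= learning_rate * grad              # move against the gradient
print(f"batch GD: w = {w:.3f}, loss = {loss(w):.5f}")

# --- Stochastic gradient descent ---
w = 0.0
learning_rate = 0.05
for epoch in range(5):
    for i in rng.permutation(len(x)):            # one randomly ordered pass per epoch
        grad_i = 2 * (w * x[i] - y[i]) * x[i]    # gradient from a single example
        w -= learning_rate * grad_i              # cheap, noisy update
print(f"SGD:      w = {w:.3f}, loss = {loss(w):.5f}")
```

Both runs recover a weight close to 3; the stochastic version trades exact gradients for much cheaper per-step updates, which is its main advantage on large datasets.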
Syllabus
Introduction
What is optimization
Prerequisites
Loss Function
Gradient Descent Explained
Learning Rate Explained
Limitations of Gradient Descent
Stochastic Gradient Descent
Gradient Descent
Advantages
Demo
Conclusion
Spotlight
Taught by
NashKnolX