Optimisation

Alfredo Canziani via YouTube

Overview

Explore a comprehensive lecture on optimization techniques in deep learning, covering gradient descent, stochastic gradient descent (SGD), and momentum updates. Delve into adaptive methods such as RMSprop and Adam, and understand how normalization layers affect neural network training. Learn the intuition behind these concepts, how the methods compare in practice, and how they influence convergence. Discover a real-world application of neural networks in accelerating MRI scans, demonstrating the practical impact of optimization in industry.
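
The update rules named above can be sketched compactly. This is a minimal illustration of plain gradient descent, SGD with momentum, and Adam on a toy quadratic loss; the hyperparameter values and function names are illustrative, not taken from the lecture.

```python
# Illustrative sketches of the optimizer updates the lecture covers:
# gradient descent, momentum, and Adam. Toy quadratic loss f(w) = 0.5 * ||w||^2.
import numpy as np

def grad(w):
    # Gradient of f(w) = 0.5 * ||w||^2 is simply w.
    return w

def gd(w, lr=0.1, steps=100):
    # Plain gradient descent: step against the gradient.
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def momentum(w, lr=0.1, beta=0.9, steps=100):
    # Momentum: accumulate a velocity that smooths successive gradients.
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v + grad(w)
        w = w - lr * v
    return w

def adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=100):
    # Adam: per-coordinate step sizes from bias-corrected first and
    # second moment estimates of the gradient.
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)   # bias correction for the warm-up phase
        v_hat = v / (1 - b2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w0 = np.array([5.0, -3.0])
for name, opt in [("GD", gd), ("Momentum", momentum), ("Adam", adam)]:
    print(name, np.linalg.norm(opt(w0)))
```

All three drive the iterate toward the minimum at the origin; their differing trajectories are the subject of the performance comparisons mentioned above.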

Syllabus

– Week 5 – Lecture
– Gradient Descent
– Stochastic Gradient Descent
– Momentum
– Adaptive Methods
– Normalization Layers
– The Death of Optimization
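
The "Normalization Layers" topic can be illustrated with a batch-normalization forward pass in training mode; this sketch and its parameter names are an assumption for illustration, not code from the lecture.

```python
# Minimal sketch of a batch-normalization forward pass (training mode).
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: (batch, features). Normalize each feature over the batch,
    # then apply a learnable affine transform (gamma, beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.randn(32, 4) * 10 + 3   # badly scaled activations
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0), y.std(axis=0))  # each feature is re-centered and re-scaled
```

Re-centering and re-scaling activations this way is one reason normalization layers ease optimization, which is the effect on training discussed in the lecture.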

Taught by

Alfredo Canziani
