Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

Yannic Kilcher via YouTube

Overview

Explore a comprehensive video analysis of the groundbreaking paper "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift." Delve into the concept of internal covariate shift and its impact on deep neural network training. Learn how batch normalization addresses this issue by normalizing layer inputs, allowing for higher learning rates and less careful parameter initialization. Discover how this technique acts as a regularizer, potentially eliminating the need for dropout. Examine the impressive results achieved when applying batch normalization to state-of-the-art image classification models, including significant improvements in training speed and accuracy. Gain insights into the paper's methodology, implementation details, and its impact on the field of deep learning.
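The normalization of layer inputs described above can be sketched as follows. This is a minimal NumPy illustration of the training-time forward pass, not code from the video; `gamma` and `beta` stand for the paper's learned scale and shift parameters, and the shapes are illustrative:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-normalize a mini-batch x of shape (batch, features)."""
    mu = x.mean(axis=0)                     # per-feature mini-batch mean
    var = x.var(axis=0)                     # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
    return gamma * x_hat + beta             # learned scale/shift restore representational power

# Example: inputs with a shifted, scaled distribution get re-centered.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

With `gamma = 1` and `beta = 0`, each feature of `y` has approximately zero mean and unit variance over the batch; during training the network learns `gamma` and `beta`, so it can undo the normalization where that helps.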

Syllabus

Introduction
What is Batch Normalization
Training
Back Propagation
Results

Taught by

Yannic Kilcher
