

NFNets - High-Performance Large-Scale Image Recognition Without Normalization

Yannic Kilcher via YouTube

Overview

Explore a comprehensive video analysis of the research paper "High-Performance Large-Scale Image Recognition Without Normalization" from DeepMind, which introduces NFNets. Delve into the approach of Normalizer-Free Networks, which achieve state-of-the-art classification accuracy on ImageNet without using batch normalization. Learn about the advantages and disadvantages of BatchNorm, and discover how adaptive gradient clipping (AGC) and architectural improvements enable NFNets to outperform traditional models. Gain insights into the benefits of this technique, including faster training, improved accuracy, and enhanced transfer learning performance. Follow along as the video breaks down the paper's key contributions, compares NFNets to EfficientNet, and discusses the implications for future deep learning research.
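
The key technique discussed in the video, adaptive gradient clipping (AGC), rescales each unit's gradient based on the ratio of gradient norm to weight norm rather than a fixed global threshold. Below is a minimal NumPy sketch of that per-unit rule for orientation before watching; the function names and defaults (agc_clip, lam=0.01, eps=1e-3) are illustrative assumptions, not DeepMind's reference implementation.

    import numpy as np

    def unitwise_norm(x):
        # Norm per output unit: scalars/vectors elementwise, matrices per row,
        # conv kernels per output channel (all trailing axes reduced).
        if x.ndim <= 1:
            return np.abs(x)
        return np.sqrt(np.sum(x ** 2, axis=tuple(range(1, x.ndim)), keepdims=True))

    def agc_clip(weights, grads, lam=0.01, eps=1e-3):
        # Rescale each unit's gradient so that ||G_i|| <= lam * max(||W_i||, eps).
        w_norm = np.maximum(unitwise_norm(weights), eps)
        g_norm = np.maximum(unitwise_norm(grads), 1e-6)
        max_norm = lam * w_norm
        clipped = grads * (max_norm / g_norm)
        return np.where(g_norm > max_norm, clipped, grads)

    # Example: clip an unusually large gradient for one dense layer's weights.
    W = np.random.randn(256, 128)
    G = 10.0 * np.random.randn(256, 128)
    G_clipped = agc_clip(W, G)

Because the threshold scales with each unit's weight norm, small weights get proportionally tighter clipping, which is what lets NFNets train stably at large batch sizes without BatchNorm.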

Syllabus

- Intro & Overview
- What's the problem with BatchNorm?
- Paper Contribution Overview
- Beneficial properties of BatchNorm
- Previous work: NF-ResNets
- Adaptive Gradient Clipping
- AGC and large batch size
- AGC induces implicit dependence between training samples
- Are BatchNorm's problems solved?
- Network architecture improvements
- Comparison to EfficientNet
- Conclusion & Comments

Taught by

Yannic Kilcher

