Overview
Explore an in-depth tutorial on FractalNet, an alternative to residual neural networks that suggests the strength of very deep models such as ResNet comes from implicitly training sub-paths of different lengths rather than from the residual connections themselves. Dive into the FractalNet paper, covering the fractal expansion rule, the key variables of a fractal network, and the concept of drop path. Examine FractalNet's training procedure, the datasets used, and its results. Follow along with a comprehensive code walkthrough, including detailed explanations of the FractalNet, FractalBlock, and ConvBlock classes, as well as the join and drop_mask functions. Gain insight into an innovative deep learning architecture that challenges conventional wisdom about residual networks and offers an "anytime" property: predictions can be read out early from a shallow sub-path or later from the full network for higher accuracy.
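For reference while following the walkthrough, the fractal expansion rule can be sketched in a few lines of PyTorch. The class names below mirror the syllabus (ConvBlock, FractalBlock), but the specific layer choices (3x3 convolution, batch norm, element-wise mean as the join) are assumptions for illustration rather than the video's exact code.

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """The base case f_1(z): conv -> batch norm -> ReLU."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return torch.relu(self.bn(self.conv(x)))

class FractalBlock(nn.Module):
    """Fractal expansion rule:
       f_1(z)     = ConvBlock(z)
       f_{C+1}(z) = join(f_C(f_C(z)), ConvBlock(z))
    """
    def __init__(self, in_ch, out_ch, columns):
        super().__init__()
        self.shallow = ConvBlock(in_ch, out_ch)  # single-conv path
        self.deep = None
        if columns > 1:
            # two stacked copies of the previous fractal: f_C composed with f_C
            self.deep = nn.Sequential(
                FractalBlock(in_ch, out_ch, columns - 1),
                FractalBlock(out_ch, out_ch, columns - 1),
            )

    def forward(self, x):
        if self.deep is None:
            return self.shallow(x)
        # join the parallel paths with an element-wise mean
        return (self.shallow(x) + self.deep(x)) / 2
```

For example, `FractalBlock(3, 64, columns=4)` expands to the four-column structure described in the paper; a full FractalNet stacks several such blocks with pooling between them.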
Syllabus
- Introduction
- FractalNet Paper
- Fractal Expansion Rule
- Important Variables for Fractal Network
- Overview of Drop Path (see the sketch after this syllabus)
- FractalNet Training & Datasets
- FractalNet Results
- Code Overview
- Code Introduction
- FractalNet Class
- FractalBlock Class
- FractalBlock join function
- FractalBlock drop_mask function
- ConvBlock Class
- Conclusion
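The drop-path and drop_mask items above refer to FractalNet's regularizer, in which entire inputs to a join are randomly dropped during training so that individual sub-paths learn to make useful predictions on their own. Below is a minimal sketch of the local variant, assuming the parallel path outputs are joined by averaging; the function name, signature, and drop probability are illustrative and are not the walkthrough's drop_mask implementation.

```python
import torch

def local_drop_path(paths, drop_prob=0.15, training=True):
    """paths: list of same-shaped tensors, one per parallel path.
    Randomly drops whole paths, then averages the survivors (the join)."""
    if not training or drop_prob == 0.0:
        return torch.stack(paths).mean(dim=0)
    keep = torch.rand(len(paths)) > drop_prob  # Bernoulli keep mask per path
    if not keep.any():
        # the paper guarantees at least one surviving input per join
        keep[torch.randint(len(paths), (1,))] = True
    kept = [p for p, k in zip(paths, keep) if k]
    return torch.stack(kept).mean(dim=0)
```

The paper alternates this local sampling with a global mode that keeps a single column end to end, which is what lets individual columns serve as standalone predictors; that variant is omitted here for brevity.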
Taught by
Yacine Mahdid