Class Central Classrooms (beta): YouTube videos curated by Class Central.
Classroom Contents
Regularization of Big Neural Networks
1. Intro
2. Big Neural Nets
3. Big Models Over-Fitting
4. Training with DropOut
5. DropOut/Connect Intuition
6. Theoretical Analysis of DropConnect
7. MNIST Results
8. Varying Size of Network
9. Varying Fraction Dropped
10. Comparison of Convergence Rates
11. Limitations of DropOut/Connect
12. Stochastic Pooling
13. Methods for Test Time
14. Varying Size of Training Set
15. Convergence / Over-Fitting
16. Street View House Numbers
17. Deconvolutional Networks
18. Recap: Sparse Coding (Patch-based)
19. Reversible Max Pooling
20. Single Layer Cost Function
21. Single Layer Inference
22. Effect of Sparsity
23. Effect of Pooling Variables
24. Talk Overview
25. Stacking the Layers
26. Two Layer Example
27. Link to Parts and Structure Models
28. Caltech 101 Experiments
29. Layer 2 Filters
30. Classification Results: Caltech 101
31. Deconvolutional + Convolutional
32. Summary