Approximation with Deep Networks - Rémi Gribonval, Inria
- 1 Introduction
- 2 Feedforward neural networks
- 3 Studying the expressivity of DNNs
- 4 Example: the ReLU activation function (see the sketch after this list)
- 5 ReLU networks
- 6 Universal approximation property
- 7 Why sparsely connected networks?
- 8 Same sparsity - various network shapes
- 9 Approximation with sparse networks
- 10 Direct vs inverse estimate
- 11 Notion of approximation space
- 12 Role of skip-connections
- 13 Counting neurons vs connections
- 14 Role of the activation function
- 15 The case of spline activation functions (Theorem 2)
- 16 Guidelines to choose an activation?
- 17 Rescaling equivalence with the ReLU
- 18 Benefits of depth?
- 19 Role of depth
- 20 Set theoretic picture
- 21 Summary: Approximation with DNNs
- 22 Overall summary & perspectives
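
The segment "Example: the ReLU activation function" lends itself to a concrete illustration. Below is a minimal sketch (not taken from the lecture) in NumPy: a one-hidden-layer ReLU network with two neurons that represents the absolute value exactly, via the identity |x| = relu(x) + relu(-x). The weights W1 and w2 are chosen to realize this identity and are illustrative only.

```python
import numpy as np

def relu(x):
    """ReLU activation: max(0, x), applied elementwise."""
    return np.maximum(0.0, x)

# Two hidden neurons suffice to represent |x| exactly:
# |x| = relu(x) + relu(-x)
W1 = np.array([[1.0], [-1.0]])  # hidden weights: 2 neurons, 1 input
w2 = np.array([1.0, 1.0])       # output weights summing the two neurons

def net(x):
    x = np.atleast_1d(x).astype(float)
    hidden = relu(W1 @ x[None, :])  # shape (2, n)
    return w2 @ hidden              # shape (n,)

xs = np.linspace(-2.0, 2.0, 9)
assert np.allclose(net(xs), np.abs(xs))  # exact, not just approximate
```

This exact two-neuron representation is the simplest instance of the piecewise-linear expressivity of ReLU networks that segments 3-5 of the lecture study in generality.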