Approximation Theory of Deep Learning from the Dynamical Viewpoint
- 1 Intro
- 2 Deep Learning: Theory vs Practice
- 3 Composition is Dynamics
- 4 Supervised Learning
- 5 The Problem of Approximation
- 6 Example: Approximation by Trigonometric Polynomials (see the first sketch after this list)
- 7 The Continuum Idealization of Residual Networks
- 8 How do dynamics approximate functions?
- 9 Universal Approximation by Dynamics
- 10 Approximation of Symmetric Functions by Dynamical Hypothesis Space
- 11 Sequence Modelling Applications
- 12 DL Architectures for Sequence Modelling
- 13 Modelling Static vs Dynamic Relationships
- 14 An Approximation Theory for Sequence Modelling
- 15 The Recurrent Neural Network Hypothesis Space
- 16 The Linear RNN Hypothesis Space (see the second sketch after this list)
- 17 Properties of Linear RNN Hypothesis Space
- 18 Approximation Guarantee (Density)
- 19 Smoothness and Memory
- 20 Insights on the (Linear) RNN Hypothesis Space
- 21 Convolutional Architectures
- 22 Encoder-Decoder Architectures
- 23 Extending the RNN Analysis
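
Chapter 6 uses trigonometric polynomials as the classical warm-up for the density results developed later in the lecture. The sketch below is a hedged illustration, not material from the lecture itself: the target function `f`, the grid size, and the Riemann-sum coefficient estimates are all assumptions. It builds the Fourier partial sum of degree N and shows the sup-norm error shrinking as N grows.

```python
import numpy as np

def fourier_partial_sum(f, N, n_grid=2048):
    """Return the degree-N trigonometric-polynomial approximation of f,
    a 2*pi-periodic function, with coefficients estimated by Riemann sums:
        f_N(t) = a_0/2 + sum_{k=1}^{N} (a_k cos(kt) + b_k sin(kt)).
    """
    x = np.linspace(0.0, 2 * np.pi, n_grid, endpoint=False)
    fx = f(x)
    a0 = 2 * fx.mean()                      # a_0 = (1/pi) * integral of f

    def f_N(t):
        out = np.full_like(t, a0 / 2)
        for k in range(1, N + 1):
            ak = 2 * (fx * np.cos(k * x)).mean()   # a_k Riemann estimate
            bk = 2 * (fx * np.sin(k * x)).mean()   # b_k Riemann estimate
            out += ak * np.cos(k * t) + bk * np.sin(k * t)
        return out

    return f_N

# Illustrative periodic target (an assumption, not the lecture's example).
f = lambda x: np.abs(np.sin(x)) ** 3
t = np.linspace(0.0, 2 * np.pi, 512, endpoint=False)
for N in (1, 4, 16):
    err = np.max(np.abs(f(t) - fourier_partial_sum(f, N)(t)))
    print(f"N={N:2d}  sup-error {err:.4f}")     # error decreases with N
```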
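Chapter 16 turns to the linear RNN hypothesis space, whose properties, density guarantee, and memory behaviour chapters 17 to 19 then develop. The lecture may well work in a continuous-time formulation; the discrete-time sketch below is an assumed, minimal rendering of the model class: a hidden state with linear dynamics h_t = W h_{t-1} + U x_t and a linear readout y_t = c^T h_t, so each parameter choice (W, U, c) is one hypothesis in the space.

```python
import numpy as np

def linear_rnn(x, W, U, c):
    """Apply one linear-RNN hypothesis (W, U, c) to a sequence.

    x: inputs of shape (T, d_in); returns outputs of shape (T,).
    """
    T, _ = x.shape
    h = np.zeros(W.shape[0])        # hidden state, dimension m
    y = np.empty(T)
    for t in range(T):
        h = W @ h + U @ x[t]        # linear state update, no nonlinearity
        y[t] = c @ h                # linear readout
    return y

# Example: one random hypothesis with hidden width m = 8 (sizes are assumptions).
rng = np.random.default_rng(0)
m, d_in, T = 8, 3, 50
W = 0.9 * rng.standard_normal((m, m)) / np.sqrt(m)   # scaled toward stability
U = rng.standard_normal((m, d_in))
c = rng.standard_normal(m)
x = rng.standard_normal((T, d_in))
print(linear_rnn(x, W, U, c).shape)                  # (50,)
```

Because both the state update and the readout are linear, the output is a convolution of the input with the kernel t -> c^T W^(t-s) U, which is what makes the smoothness-and-memory analysis of chapters 18 and 19 tractable.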