Overview
Syllabus
Intro
Deep Learning: Theory vs Practice
Composition is Dynamics
Supervised Learning
The Problem of Approximation
Example: Approximation by Trigonometric Polynomials
The Continuum Idealization of Residual Networks
How do dynamics approximate functions?
Universal Approximation by Dynamics
Approximation of Symmetric Functions by Dynamical Hypothesis Spaces
Sequence Modelling Applications
DL Architectures for Sequence Modelling
Modelling Static vs Dynamic Relationships
An Approximation Theory for Sequence Modelling
The Recurrent Neural Network Hypothesis Space
The Linear RNN Hypothesis Space
Properties of Linear RNN Hypothesis Space
Approximation Guarantee (Density)
Smoothness and Memory
Insights on the (Linear) RNN Hypothesis Space
Convolutional Architectures
Encoder-Decoder Architectures
Extending the RNN Analysis
Taught by
Fields Institute