Splines and Imaging - From Compressed Sensing to Deep Neural Networks
Classroom Contents
- 1 Intro
- 2 Variational formulation of inverse problem
- 3 Linear inverse problems (20th century theory)
- 4 Learning as a (linear) inverse problem
- 5 Splines are analog, but intrinsically sparse
- 6 Spline synthesis example
- 7 Spline synthesis: generalization
- 8 Representer theorem for TV regularization
- 9 Other spline-admissible operators
- 10 Recovery with sparsity constraints: discretization
- 11 Structure of iterative reconstruction algorithm
- 12 Connection with deep neural networks
- 13 Deep neural networks and splines
- 14 Feedforward deep neural network
- 15 CPWL functions in high dimensions
- 16 Algebra of CPWL functions
- 17 Implication for deep ReLU neural networks
- 18 CPWL functions: further properties
- 19 Constraining activation functions
- 20 Representer theorem for deep neural networks
- 21 Outcome of representer theorem
- 22 Optimality results
- 23 Deep spline networks: Discussion
- 24 Deep spline networks (Cont'd)
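A recurring theme of sections 13-17 is that a feedforward network with ReLU activations implements a continuous piecewise-linear (CPWL) function, i.e., a multidimensional analogue of a linear spline. The sketch below is not taken from the lectures; it is a minimal illustration of that fact, with arbitrary (assumed) network sizes and tolerance. It evaluates a small random ReLU network on a 1-D grid and checks that its second finite differences vanish away from a finite set of knots, exactly as a piecewise-linear spline would.

```python
# Minimal sketch: a small feedforward ReLU network on scalar inputs is a
# continuous piecewise-linear (CPWL) function. We evaluate it on a dense grid
# and check that the second finite differences are zero except at a handful
# of "knots" where linear pieces meet.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-hidden-layer network R -> R with ReLU activations
# (weights drawn at random purely for illustration).
W1, b1 = rng.standard_normal((8, 1)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((8, 8)), rng.standard_normal(8)
w3, b3 = rng.standard_normal(8), rng.standard_normal()

def relu(z):
    return np.maximum(z, 0.0)

def network(x):
    """Feedforward pass for a scalar input x."""
    h1 = relu(W1 @ np.array([x]) + b1)
    h2 = relu(W2 @ h1 + b2)
    return w3 @ h2 + b3

xs = np.linspace(-3.0, 3.0, 2001)
ys = np.array([network(x) for x in xs])

# On a uniform grid, second differences vanish on linear pieces and are
# nonzero only near the knots, so the graph consists of finitely many
# linear segments.
curvature = np.abs(np.diff(ys, 2))
knots = np.count_nonzero(curvature > 1e-8)
print(f"approximate number of knots: {knots}")
```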