Can neural networks be trained/computed?
Classroom Contents
Foundations of Deep Learning and AI: Instabilities, Limitations, and Potential
- 1 Intro
- 2 The impact of deep learning is unprecedented
- 3 How do we determine the foundations of DL?
- 4 Instabilities in classification/decision problems
- 5 AI techniques replace doctors
- 6 Transforming image reconstruction with AI
- 7 Comparison with state-of-the-art
- 8 Instability of DL in Inverse Problems - MRI
- 9 The press reports on instabilities
- 10 Hilbert's program on the foundations of mathematics
- 11 Program on the foundations of DL and AI
- 12 Should we expect instabilities in deep learning?
- 13 The instabilities in classification cannot be cured
- 14 The mathematical setup
- 15 Trained DL NNs yield small error on training data
- 16 Universal instability theorem
- 17 AI-generated hallucinations and instability
- 18 Gaussian perturbations and AUTOMAP
- 19 Sharpness of Theorem 3
- 20 Can neural networks be trained/computed?
- 21 Kernel awareness in compressed sensing
- 22 Kernel awareness is essential
- 23 Worst case perturbations for AUTOMAP
- 24 Conclusion
- 25 New book coming