Inverse Problems under a Learned Generative Prior - Lecture 1

International Centre for Theoretical Sciences via YouTube

Classroom Contents

  1. Inverse Problems under a Learned Generative Prior Lecture 1
  2. Examples of inverse problems
  3. A common prior: sparsity
  4. Sparsity can be optimized via a convex relaxation (a minimal sketch follows this list)
  5. Recovery guarantee for sparse signals
  6. Generative models learn to impressively sample from complex signal classes
  7. How are generative models used in inverse problems? (a second sketch follows this list)
  8. Generative models provide SOTA performance
  9. Deep Compressive Sensing
  10. Initial theory for generative priors analyzed global minimizers, which may be hard to find
  11. Random generative priors allow rigorous recovery guarantees
  12. Compressive sensing with a random generative prior has favorable geometry for optimization
  13. Proof Outline
  14. Deterministic Condition for Recovery
  15. Compressive sensing with a random generative prior has a provably convergent subgradient descent algorithm
  16. Guarantees for compressive sensing under generative priors have been extended to convolutional architectures
  17. Why can generative models outperform sparsity models?
  18. Sparsity appears to fail in Compressive Phase Retrieval
  19. Our formulation: Deep Phase Retrieval
  20. Generative priors can be efficiently exploited for compressive phase retrieval
  21. Comparison on MNIST
  22. New workflow for scientists
  23. Concrete steps have already been taken
  24. Further Theory Needed
  25. Main takeaways
  26. Q&A
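
The convex relaxation named in item 4 usually means replacing the combinatorial sparsity constraint with an ℓ1 penalty. Below is a minimal, illustrative Python sketch of ℓ1-regularized recovery via iterative soft-thresholding (ISTA); it is not taken from the lecture, and all problem sizes and parameters are arbitrary choices for the example.

```python
# Illustrative sketch: recover a k-sparse x from compressive measurements y = A x
# by solving the convex relaxation  min_x 0.5*||A x - y||_2^2 + lam*||x||_1  with ISTA.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 80, 5                              # signal length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)      # random Gaussian measurement matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2            # step size <= 1 / ||A||_2^2 for convergence
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - y)                      # gradient of the smooth data-fit term
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-thresholding step

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```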
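
For item 7, a common way to use a generative model in an inverse problem is to restrict the signal to the range of a generator G(z) and minimize the data misfit over the latent code, i.e. min_z ||A G(z) - y||^2. The sketch below is an illustrative toy example with a small random ReLU generator; the architecture, step size, and iteration count are assumptions made for the example, not the lecture's setup.

```python
# Illustrative sketch: compressive sensing under a generative prior. The target signal
# lies in the range of a two-layer random ReLU generator G, and we recover it from
# y = A G(z_true) by (sub)gradient descent on f(z) = 0.5*||A G(z) - y||^2.
# Sizes, step size, and iteration count are toy choices and may need tuning.
import numpy as np

rng = np.random.default_rng(1)
k, d, n, m = 10, 50, 200, 60                      # latent dim, hidden width, signal dim, measurements
W1 = rng.standard_normal((d, k)) / np.sqrt(d)
W2 = rng.standard_normal((n, d)) / np.sqrt(n)
A = rng.standard_normal((m, n)) / np.sqrt(m)

relu = lambda t: np.maximum(t, 0.0)
G = lambda z: relu(W2 @ relu(W1 @ z))             # fixed random ReLU generator

z_true = rng.standard_normal(k)
y = A @ G(z_true)                                 # compressive measurements of a signal in range(G)

z = 0.1 * rng.standard_normal(k)                  # random initialization of the latent code
step = 0.05
for _ in range(2000):
    h1 = W1 @ z; a1 = relu(h1)
    h2 = W2 @ a1; x = relu(h2)
    r = A @ x - y                                 # residual in measurement space
    # Backpropagate the residual through A and both ReLU layers to get the gradient in z.
    dx = A.T @ r
    dh2 = dx * (h2 > 0)
    dh1 = (W2.T @ dh2) * (h1 > 0)
    z -= step * (W1.T @ dh1)

print("signal relative error:",
      np.linalg.norm(G(z) - G(z_true)) / np.linalg.norm(G(z_true)))
```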
