Understanding Text Generation Through Diffusion Models - From Theory to Implementation

Oxen via YouTube


Classroom Contents


  1. Intro
  2. Modeling Probability Distributions for Generative AI
  3. Problem #1: No Black Box
  4. Solution #1: Train a Network to Approximate the Probability Mass Function
  5. Problem #2: The Normalizing Constant, Z_theta, Is Intractable
  6. Solution #2: Autoregressive Modeling
  7. Solution #3 (the Real Solution): Model Score, Not Probability Mass
  8. Learning the Concrete Score Through Diffusion
  9. Evaluation
  10. So What?
  11. Takeaways
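Two of the listed topics — the intractable normalizing constant Z_theta and the autoregressive workaround — can be illustrated with a small sketch. This is not code from the video; `f_theta`, the vocabulary, and the sequence length are hypothetical stand-ins chosen to make the exponential blow-up concrete:

```python
import itertools
import math

# Hypothetical stand-in for a network f_theta that scores whole sequences.
VOCAB = ["a", "b", "c"]
SEQ_LEN = 4

def f_theta(seq):
    # Arbitrary deterministic "energy" for a sequence of tokens.
    return math.sin(sum(VOCAB.index(t) * (i + 1) for i, t in enumerate(seq)))

# Problem #2: Z_theta = sum over EVERY possible sequence -- |V|**L terms,
# exponential in sequence length. Enumerable here, hopeless at real scale.
all_seqs = itertools.product(VOCAB, repeat=SEQ_LEN)
Z_theta = sum(math.exp(f_theta(s)) for s in all_seqs)
print(len(VOCAB) ** SEQ_LEN)  # number of terms in the sum

# Solution #2: autoregressive factorization normalizes one token at a time,
# so each step sums over only |V| candidates instead of all sequences.
def next_token_probs(prefix):
    logits = [f_theta(prefix + (v,)) for v in VOCAB]
    z = sum(math.exp(l) for l in logits)  # |V| terms, cheap
    return [math.exp(l) / z for l in logits]

p = next_token_probs(("a", "b"))
assert abs(sum(p) - 1.0) < 1e-9  # properly normalized at each step
```

The per-step normalization is what makes autoregressive models trainable; the video's score-based solution (items 7–8) avoids computing Z_theta altogether by modeling how probability changes between nearby sequences instead of the probability mass itself.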
