Symbolic Knowledge Distillation: From General Language Models to Commonsense Models Explained

Yannic Kilcher via YouTube

Classroom Contents

  1. Intro & Overview
  2. Sponsor: Weights & Biases
  3. Commonsense Knowledge Graphs
  4. ATOMIC dataset
  5. Generating the corpus from a model
  6. Prompting GPT-3
  7. Generating Events
  8. Generating Inferences
  9. Evaluating the created dataset
  10. Introducing the critic
  11. Using the critic to filter the data
  12. Training a student on the generated data
  13. Key Findings
  14. Comments & Conclusion
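Chapters 5 through 12 above outline a generate-filter-train loop: GPT-3 is few-shot prompted to produce events and then commonsense inferences about them, a trained critic scores the generated triples, low-scoring triples are filtered out, and a smaller student model is fine-tuned on what remains. The sketch below illustrates that loop only; the function names and the dummy critic are hypothetical placeholders, not the paper's or the video's actual code.

```python
# Illustrative sketch of the symbolic knowledge distillation pipeline
# (generate -> critic-filter -> train student). All functions are stubs.

def generate_events(n=5):
    # Placeholder: in the paper, GPT-3 is few-shot prompted with
    # ATOMIC-style events and asked to produce new ones.
    return [f"PersonX does something {i}" for i in range(n)]

def generate_inferences(event):
    # Placeholder: GPT-3 is prompted per event to produce commonsense
    # inferences along relations such as xEffect or xNeed.
    return [(event, "xEffect", f"a plausible effect of '{event}'")]

def critic_score(triple):
    # Placeholder critic: the real critic is a classifier trained on
    # human acceptability judgments; here we return a fixed score.
    return 0.9

def build_filtered_corpus(threshold=0.8):
    corpus = []
    for event in generate_events():
        for triple in generate_inferences(event):
            if critic_score(triple) >= threshold:  # keep high-quality triples
                corpus.append(triple)
    # The filtered corpus would then be used to fine-tune a smaller
    # student model, completing the distillation step.
    return corpus

if __name__ == "__main__":
    print(build_filtered_corpus()[:3])
```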
