The Future of Natural Language Processing

Classroom Contents
- 1 Intro
- 2 Open questions, current trends, limits
- 3 Model size and computational efficiency
- 4 Using more and more data
- 5 Pretraining on more data
- 6 Fine-tuning on more data
- 7 More data or better models
- 8 In-domain vs. out-of-domain generalization
- 9 The limits of NLU and the rise of NLG
- 10 Solutions to the lack of robustness
- 11 Reporting and evaluation issues
- 12 The inductive bias question
- 13 The common sense question