Introduction to Language Modeling: From N-grams to Neural Networks

UofU Data Science via YouTube

Overview

Learn the fundamentals of language modeling in this lecture, which covers key concepts from padding techniques through advanced neural language models. It begins with a review of assignments, then introduces padding methodology and examines the limitations of static embeddings, random initialization, and bag-of-words approaches. It then traces the timeline from transformer models to RLHF (Reinforcement Learning from Human Feedback), gives an in-depth treatment of N-gram language models, and concludes with neural language models and their applications in modern natural language processing.

Syllabus

Recap / Assignments
Padding
Limitations of static embeddings, random initialization, and bag-of-words (BoW)
Transformers to RLHF timeline
N-gram LMs
Neural LMs
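The padding and N-gram topics above can be illustrated with a minimal sketch: a bigram language model trained by maximum likelihood on a toy corpus, with start/end padding symbols. This is not code from the lecture; the function names and the corpus are illustrative assumptions.

```python
from collections import Counter

def pad(tokens, n):
    # Standard n-gram padding: (n-1) start symbols and one end symbol
    return ["<s>"] * (n - 1) + tokens + ["</s>"]

def train_bigram(corpus):
    # corpus: list of token lists; returns MLE bigram probabilities
    # P(w2 | w1) = count(w1, w2) / count(w1)
    bigrams, contexts = Counter(), Counter()
    for sent in corpus:
        toks = pad(sent, 2)
        contexts.update(toks[:-1])
        bigrams.update(zip(toks[:-1], toks[1:]))
    return {bg: c / contexts[bg[0]] for bg, c in bigrams.items()}

# Toy corpus (illustrative)
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
probs = train_bigram(corpus)
print(probs[("<s>", "the")])  # 1.0: both sentences start with "the"
print(probs[("the", "cat")])  # 0.5: "the" is followed by "cat" or "dog"
```

A real N-gram model would add smoothing for unseen bigrams, one of the limitations that motivates the neural language models covered at the end of the lecture.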

Taught by

UofU Data Science

