Neural Nets for NLP - Class Introduction & Why Neural Nets?

Graham Neubig via YouTube

Overview

Explore the fundamentals of neural networks for natural language processing in this introductory lecture from CMU's Neural Networks for NLP course. Delve into example tasks and their challenges, discover how neural networks can address these issues, and gain insights into the basic concepts of neural network architectures for NLP prediction tasks. Learn about forward propagation, computation graphs, model parameters, and training processes using frameworks like DyNet. Examine the Continuous Bag of Words (CBOW) model and understand what vector representations signify in the context of NLP. Acquire essential knowledge to kickstart your journey into applying neural networks to natural language processing tasks.
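To make the CBOW idea concrete, here is a minimal sketch of a CBOW-style sentence classifier in plain NumPy: the word vectors in a sentence are summed into a single "bag" vector, which a linear layer then scores for each class. The toy vocabulary, dimensions, and random parameters are illustrative assumptions, not the course's actual code (the lecture itself uses DyNet).

```python
# Hedged sketch of a CBOW sentence classifier.
# Vocabulary, dimensions, and parameters are toy assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = {"the": 0, "movie": 1, "was": 2, "great": 3, "awful": 4}
EMB_DIM, N_CLASSES = 8, 2

# Parameters: one embedding per word, plus a linear scoring layer.
E = rng.normal(size=(len(VOCAB), EMB_DIM))   # word embeddings
W = rng.normal(size=(N_CLASSES, EMB_DIM))    # class weight matrix
b = np.zeros(N_CLASSES)                      # class bias

def cbow_scores(sentence):
    """Forward pass: sum the word vectors, then score each class."""
    h = sum(E[VOCAB[w]] for w in sentence.split())  # bag-of-words sum
    return W @ h + b                                # one score per class

scores = cbow_scores("the movie was great")
print(scores.shape)
```

In a framework like DyNet, each of these operations would become a node in a computation graph, so gradients for training can be computed automatically by backpropagation.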

Syllabus

Intro
Are These Sentences OK?
Engineering Solutions
Phenomena to Handle
An Example Prediction Problem: Sentence Classification
A First Try: Bag of Words (BOW)
Build It, Break It
Combination Features
Basic Idea of Neural Networks (for NLP Prediction Tasks)
An edge represents a function argument (and also a data dependency); edges are just pointers to nodes
Algorithms (1)
Forward Propagation
Algorithms (2)
Basic Process in Dynamic Neural Network Frameworks
Computation Graph and Expressions
Model and Parameters
Parameter Initialization
Trainers and Backprop
Training with DyNet
Continuous Bag of Words (CBOW) model
What do Our Vectors Represent?
Things to Remember
Class Format

Taught by

Graham Neubig

