
Probabilistic Methods for Classification - 2009

Center for Language & Speech Processing (CLSP), JHU via YouTube

Overview

Explore probabilistic methods for classification in this comprehensive lecture by Gideon Mann from the Center for Language & Speech Processing at Johns Hopkins University. Delve into supervised machine learning techniques, covering topics such as information extraction, semi-supervised learning, and document classification. Learn about Naive Bayes, maximum likelihood estimation, and conditional log-linear models. Examine graphical models, including Maximum Entropy Models and Conditional Random Fields. Understand gradient-based optimization, hidden Markov models, and dependency parsing. Investigate advanced concepts such as the Generalized Expectations criteria, KL divergence, and label regularization. Gain valuable insights into the theoretical foundations and practical applications of probabilistic classification methods in natural language processing and machine learning.
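As a taste of the material, the Naive Bayes document classifier mentioned above can be fit by maximum likelihood over word counts. The following is a minimal illustrative sketch (not taken from the lecture), using add-one smoothing and a tiny made-up spam/ham corpus:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs, labels):
    """Fit Naive Bayes by maximum likelihood with add-one smoothing.

    docs: list of token lists; labels: parallel list of class labels.
    Returns (log_prior, log_lik, vocab).
    """
    vocab = {w for d in docs for w in d}
    class_counts = Counter(labels)
    word_counts = defaultdict(Counter)  # per-class token counts
    for d, y in zip(docs, labels):
        word_counts[y].update(d)
    n = len(docs)
    log_prior = {y: math.log(c / n) for y, c in class_counts.items()}
    log_lik = {}
    for y in class_counts:
        total = sum(word_counts[y].values()) + len(vocab)  # add-one smoothing
        log_lik[y] = {w: math.log((word_counts[y][w] + 1) / total) for w in vocab}
    return log_prior, log_lik, vocab

def classify(doc, log_prior, log_lik, vocab):
    """argmax_y [ log P(y) + sum_w log P(w|y) ]; unseen words are ignored."""
    scores = {
        y: lp + sum(log_lik[y][w] for w in doc if w in vocab)
        for y, lp in log_prior.items()
    }
    return max(scores, key=scores.get)

# Hypothetical toy corpus for illustration only
docs = [["win", "money", "now"], ["meeting", "at", "noon"],
        ["win", "prize", "money"], ["lunch", "meeting", "today"]]
labels = ["spam", "ham", "spam", "ham"]
model = train_nb(docs, labels)
print(classify(["free", "money"], *model))  # -> spam
```

The lecture goes on to contrast this generative model with discriminative maximum-entropy (conditional log-linear) models, which optimize the conditional likelihood directly.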

Syllabus

Introduction
Information Extraction
Semi-supervised Learning
Outline
Supervised Machine Learning
Estimation
Classification
Document Classification
Naive Bayes
Maximum likelihood estimation
Sum over data
Recap
Conditional Log Linear Models
Graphical Models
Maximum Entropy Models
Gradient-Based Optimization
Naive Bayes vs Maximum Entropy
Conditional Random Field
Hidden Markov Model
Model Framework
Model Structure
Conditional Random Field Models
Dependency Parsing
Generalized Expectations Criteria
KL Divergence
GE Estimation
Label Regularization
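
The KL divergence item near the end of the syllabus underpins the Generalized Expectations criteria and label regularization, which penalize the gap between model expectations and prior expectations. As a minimal sketch of the quantity itself (illustrative, not from the lecture):

```python
import math

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i * log(p_i / q_i).

    Assumes p and q are discrete distributions over the same support
    and q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute zero.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.5, 0.5]   # e.g. a prior label distribution
skewed = [0.9, 0.1]    # e.g. the model's predicted label distribution
print(round(kl_divergence(uniform, skewed), 4))
```

Note that KL divergence is asymmetric: KL(p || q) generally differs from KL(q || p), and it is zero exactly when the two distributions coincide.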

Taught by

Center for Language & Speech Processing (CLSP), JHU
