

Counterfactual Inference with Unobserved Confounding via Exponential Family

Harvard CMSA via YouTube

Overview

Watch a 44-minute lecture from Harvard CMSA featuring MIT professor Devavrat Shah exploring counterfactual inference with unobserved confounding through exponential family modeling. Learn about personalized decision-making challenges in recommender systems, focusing on how to infer user engagement with different recommendation sequences while accounting for unobserved factors. Discover a computationally efficient method for learning distribution parameters whose estimation error scales linearly with metric entropy. Explore sufficient conditions under which compactly supported distributions satisfy a logarithmic Sobolev inequality, and understand how these ideas apply to sequential recommender systems, measurement-error imputation, and undirected graphical models. The lecture covers theoretical foundations including maximum likelihood estimation, proper loss functions, and parameter estimation techniques for handling heterogeneous users observed through a single trajectory.
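
As background for the exponential family framing discussed in the lecture, here is a minimal, hypothetical Python sketch (not taken from the talk) of maximum likelihood estimation for a one-parameter exponential family p(x; theta) = h(x) exp(theta T(x) - A(theta)), using the Poisson family as a stand-in. The gradient of the average log-likelihood is the gap between the empirical mean of the sufficient statistic and the model's expected sufficient statistic A'(theta).

    # Hypothetical illustration: MLE in a one-parameter exponential family
    # (Poisson: T(x) = x, A(theta) = exp(theta)). Not the lecture's algorithm.
    import numpy as np

    rng = np.random.default_rng(0)
    true_theta = np.log(3.0)                      # natural parameter; mean = exp(theta) = 3
    x = rng.poisson(lam=np.exp(true_theta), size=5000)

    theta = 0.0                                   # initial guess
    for _ in range(500):
        grad = x.mean() - np.exp(theta)           # E_hat[T(x)] - A'(theta)
        theta += 0.05 * grad                      # gradient ascent on the log-likelihood

    print(f"closed-form MLE: {np.log(x.mean()):.4f}, gradient-ascent estimate: {theta:.4f}")

For the Poisson family the MLE also has a closed form (the log of the sample mean), so the gradient-ascent iterate should match it closely; the same moment-matching structure underlies parameter estimation in richer exponential family models.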

Syllabus

Intro
Sequential Recommender System
Challenges
Problem Setup
Our Approach
Inference Tasks
An Application: Imputing Measurement Error
Remainder Of The Talk: The Loss Function, And Why It Works
Maximum Likelihood Estimation
An Alternative
A Proper Loss Function
Proof
Undirected Graphical Model
Back To Our Setting
In Summary: Parameter Estimation

Taught by

Harvard CMSA
