Convergence of Nearest Neighbor Classification - Sanjoy Dasgupta

Institute for Advanced Study via YouTube

Overview

Explore the convergence of nearest neighbor classification in this 49-minute Members' Seminar presented by Sanjoy Dasgupta of the University of California, San Diego. Delve into the nearest neighbor rule as a nonparametric estimator, the statistical learning theory setup, and consistency results under continuity assumptions. Examine universal consistency in R^p and in general metric spaces, smoothness and margin conditions, and accurate rates of convergence under smoothness. Investigate tradeoffs in choosing k, an adaptive NN classifier, and a nonparametric notion of margin. Conclude with open problems in nearest neighbor classification.
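
For readers new to the topic, here is a minimal sketch of the k-nearest-neighbor classifier that the seminar analyzes: brute-force Euclidean distances and a majority vote. It illustrates the general method only; the function name knn_predict and the toy data are ours, not from the talk.

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k):
    # Euclidean distance from the query point x to every training point.
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k closest training points.
    nearest = np.argsort(dists)[:k]
    # Predict by majority vote over the neighbors' labels.
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy usage: two well-separated classes in the plane.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.2, 0.1]), k=3))  # prints 0

The choice of k is exactly the tradeoff the talk examines: small k gives low bias but high variance, large k the reverse, which motivates the adaptive NN classifiers discussed in the seminar.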

Syllabus

Intro
Nearest neighbor
A nonparametric estimator
The data space
Statistical learning theory setup
Questions of interest
Consistency results under continuity
Universal consistency in R^p
A key geometric fact
Universal consistency in metric spaces
Smoothness and margin conditions
A better smoothness condition for NN
Accurate rates of convergence under smoothness
Under the hood
Tradeoffs in choosing k
An adaptive NN classifier
A nonparametric notion of margin
Open problems
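
As background for the consistency entries in the syllabus above (classical results, not taken from the lecture slides): for binary classification with a continuous regression function, Cover and Hart (1967) showed that the asymptotic risk R_\infty of the 1-NN rule satisfies

R^* \le R_\infty \le 2 R^* (1 - R^*),

where R^* is the Bayes risk, so 1-NN is at most roughly a factor of two worse than optimal in the limit. Stone (1977) showed that the k-NN rule is universally consistent in R^p whenever k \to \infty and k/n \to 0.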

Taught by

Institute for Advanced Study
