
Manifold Learning with Noisy Data: Dimension Reduction and Support Estimation

Institut des Hautes Etudes Scientifiques (IHES) via YouTube

Overview

Explore the intricacies of manifold learning with noisy data in this 43-minute lecture by Elisabeth Gassiat from LMO/Université Paris-Saclay. Delve into a general framework for recovering low-dimensional non-linear structures from high-dimensional data contaminated with significant, unknown noise. Examine minimax rates for support estimation measured by the Hausdorff distance. Cover topics including dimension reduction, manifold learning concepts, geometric approaches to noisy data, additive noise examples, and support estimation through deconvolution with Gaussian noise. Investigate identifiability theorems, geometric conditions on high-dimensional data, and practical examples of supports. Gain valuable insights into estimation upper bounds and leave with a comprehensive understanding of manifold learning in the presence of noisy data.
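As a rough sketch of the setting described above (the notation below is illustrative and not taken verbatim from the lecture): observations are modeled as noisy samples of points lying on an unknown low-dimensional support, and estimation quality is measured by the Hausdorff distance between the estimated and true supports.

Y_i = X_i + \varepsilon_i, \qquad X_i \in M \subset \mathbb{R}^D, \qquad \varepsilon_i \sim \mathcal{N}(0, \sigma^2 I_D),

where M is the unknown low-dimensional support to be recovered from the observations Y_1, \dots, Y_n. The (truncated) Hausdorff loss between an estimator \widehat{M} and M is based on

d_H(\widehat{M}, M) = \max\Big\{ \sup_{x \in \widehat{M}} \inf_{y \in M} \|x - y\|, \ \sup_{y \in M} \inf_{x \in \widehat{M}} \|x - y\| \Big\}.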

Syllabus

Intro
Dimension reduction
Manifold learning: some ideas (no noise)
Noisy data. What happens with noise? Geometric ideas
Additive noise: examples
Support estimation: deconvolution with Gaussian noise and (truncated) Hausdorff loss
Robustness to the assumptions on the noise
First question: identifiability
Identifiability theorem
When does HD hold? Simple facts.
When does HD hold? Geometrical condition
When does HD hold? Examples of supports
Second question: estimation (upper bound)
Take-home message

Taught by

Institut des Hautes Etudes Scientifiques (IHES)
