

Generalization Theory in Machine Learning

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Delve into the first part of a lecture on generalization theory in machine learning, presented by Adam Oberman of McGill University at the Institute for Pure & Applied Mathematics (IPAM). Explore the foundations of statistical learning theory, its parallels with classical approximation theory, and how concentration of measure inequalities allow learning bounds to escape the curse of dimensionality. Examine learning bounds for traditional methods such as support vector machines (SVMs) and kernel methods, and the challenges in extending these bounds to deep neural networks. Gain insight into image classification, hypothesis classes, kernel methods, and the core ideas of statistical learning theory. Investigate the curse of dimensionality, the generalization gap, and the empirical and non-empirical notions of complexity used to bound it. This 71-minute lecture serves as a resource for those seeking to understand the theoretical underpinnings of machine learning in high-dimensional settings.
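As a concrete illustration (not part of the lecture itself), the generalization gap that the lecture analyzes can be estimated empirically for a Gaussian-kernel SVM. The sketch below assumes scikit-learn and NumPy are available; the dataset, kernel width, and regularization constant are illustrative choices, not values from the talk.

# Minimal sketch (illustrative, not from the lecture): estimate the
# generalization gap of a Gaussian-kernel (RBF) SVM on synthetic
# high-dimensional data. All parameter choices are arbitrary examples.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic binary classification problem in dimension d = 100.
X, y = make_classification(n_samples=2000, n_features=100,
                           n_informative=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)

# Gaussian (RBF) kernel SVM -- one of the "traditional" hypothesis classes
# whose learning bounds the lecture discusses.
clf = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X_train, y_train)

train_err = 1.0 - clf.score(X_train, y_train)   # empirical risk
test_err = 1.0 - clf.score(X_test, y_test)      # estimate of the true risk
print(f"train error: {train_err:.3f}")
print(f"test  error: {test_err:.3f}")
print(f"generalization gap (estimate): {test_err - train_err:.3f}")

As the number of samples n grows, this gap typically shrinks at roughly a 1/sqrt(n) rate, which is the behavior the concentration-of-measure bounds in the lecture make precise.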

Syllabus

Introduction
Traditional Machine Learning
Deep Learning
Deep Learning Everywhere
Image Classification
ImageNet
Classification
Classification Notation
Classification Loss
Hypothesis Classes
Kernel Methods
Gaussian Kernel
Quadratic Loss
Summary
Statistical Learning Theory
Curse of Dimensionality
Gap for Learning
Proof
First Inequality
Defining Complexity
Empirical Complexity
Non-Empirical Complexity
The Gap
McDiarmid's Inequality (see the note after this syllabus)
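
For reference, a minimal sketch of McDiarmid's inequality as it is typically used in this argument; the exact notation and constants in the lecture may differ. If $f(X_1,\dots,X_n)$ changes by at most $c_i$ when only its $i$-th argument changes, then for independent $X_1,\dots,X_n$,

$$\Pr\big(f(X_1,\dots,X_n) - \mathbb{E}[f] \ge t\big) \le \exp\!\left(-\frac{2t^2}{\sum_{i=1}^n c_i^2}\right).$$

Applied to the worst-case gap $\sup_{h\in\mathcal{H}}\big(R(h) - \widehat{R}_n(h)\big)$ with a loss bounded in $[0,1]$ (so each $c_i = 1/n$), this gives, with probability at least $1-\delta$,

$$\sup_{h\in\mathcal{H}}\big(R(h) - \widehat{R}_n(h)\big) \le 2\,\mathfrak{R}_n(\mathcal{H}) + \sqrt{\frac{\log(1/\delta)}{2n}},$$

where $\mathfrak{R}_n(\mathcal{H})$ is the Rademacher complexity, corresponding to the empirical and non-empirical complexity items in the syllabus.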

Taught by

Institute for Pure & Applied Mathematics (IPAM)
