Robustness Should Not Be at Odds with Accuracy - A Statistical Learning Theory Perspective

Harvard CMSA via YouTube

Overview

Watch a 19-minute conference talk from the 2022 Symposium on Foundations of Responsible Computing in which York University's Ruth Urner explores the relationship between adversarial robustness and accuracy in deep learning models. Examine how imperceptible perturbations can cause neural networks to make incorrect predictions, and discover why traditional approaches to adversarial robustness may conflict with accuracy requirements. Learn about a novel approach using a locally adaptive robust loss that maintains consistency in 1-nearest neighbor classification under deterministic labels. Follow the systematic analysis of standard and robust Bayes classifiers, understand the data-informed adaptive robustness radius, and explore how proper modeling of adversarial examples should align with the underlying data-generating process. Gain insights into why robustness and accuracy should be complementary rather than conflicting goals in machine learning systems.
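
For orientation, one common way to write the two notions of robust loss touched on above is shown below; the exact definitions used in the talk may differ, so read this as a hedged sketch in which the fixed radius r and the data-informed, per-point radius r(x) are illustrative.

\[
\ell^{r}_{\mathrm{adv}}(h; x, y) = \sup_{x' : \|x' - x\| \le r} \mathbf{1}\bigl[h(x') \ne y\bigr],
\qquad
\ell^{\mathrm{ada}}_{\mathrm{adv}}(h; x, y) = \sup_{x' : \|x' - x\| \le r(x)} \mathbf{1}\bigl[h(x') \ne y\bigr].
\]

Replacing the global radius r with a radius r(x) derived from the data, for instance scaled by the distance from x to the nearest region of opposite label, is the kind of adjustment the talk's data-informed adaptive robustness radius points to, under which a predictor such as 1-nearest neighbor can remain both accurate and robust when labels are deterministic.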

Syllabus

Intro
Statistical Learning Theory
Adversarial Learning
Decomposition of the adversarial loss
Unexpected phenomena
Robustness at odds with accuracy
Choosing a suitable robustness parameter
The margin canonical Bayes predictor
Redefining the adversarial loss
Empirical adaptive robust loss
Adaptive robust data-augmentation
Adaptive data augmentation maintains consistency of 1-NN (see the sketch after this syllabus)
Concluding remarks
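
The later syllabus items on the empirical adaptive robust loss and adaptive robust data-augmentation suggest a simple picture that the following minimal Python sketch tries to capture. Everything in it is an illustrative assumption rather than the talk's construction: the toy data, the radius rule (half the distance to the nearest differently labeled training point), and the use of scikit-learn's KNeighborsClassifier.

```python
# Minimal, hypothetical sketch of adaptive robust data augmentation for a
# 1-nearest-neighbor classifier. The radius rule below is an assumption for
# illustration, not necessarily the construction analyzed in the talk.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Toy binary-labeled data in the plane with deterministic labels.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

def adaptive_radius(X, y, fraction=0.5):
    """Per-point radius: a fraction of the distance to the closest point
    carrying a different label (a data-informed, locally adaptive choice)."""
    radii = np.empty(len(X))
    for i in range(len(X)):
        other = X[y != y[i]]
        radii[i] = fraction * np.min(np.linalg.norm(other - X[i], axis=1))
    return radii

def augment(X, y, radii, copies=5):
    """Add perturbed copies of each point, sampled inside its own radius,
    keeping the original label (adaptive robust data augmentation)."""
    X_aug, y_aug = [X], [y]
    for _ in range(copies):
        noise = rng.normal(size=X.shape)
        noise /= np.linalg.norm(noise, axis=1, keepdims=True)
        scale = rng.uniform(0.0, 1.0, size=(len(X), 1)) * radii[:, None]
        X_aug.append(X + noise * scale)
        y_aug.append(y)
    return np.vstack(X_aug), np.concatenate(y_aug)

radii = adaptive_radius(X, y)
X_aug, y_aug = augment(X, y, radii)

clf = KNeighborsClassifier(n_neighbors=1).fit(X_aug, y_aug)
print("accuracy on the original points:", clf.score(X, y))
```

Because each perturbed copy inherits its source point's label and stays well inside that point's neighborhood, the augmentation adds robustness without contradicting nearby training labels, which is the intuition behind the consistency statement in the syllabus.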

Taught by

Harvard CMSA
