
YouTube

Adversarial Examples Are Not Bugs, They Are Features

Yannic Kilcher via YouTube

Overview

Explore a 40-minute video lecture on adversarial examples in machine learning, presented by Yannic Kilcher. Delve into research that challenges the conventional understanding of these examples as mere bugs, instead proposing that they arise from genuine features of the data. Examine the theoretical framework of non-robust features, their prevalence in standard datasets, and the misalignment between human-specified notions of robustness and the inherent geometry of the data. Gain insights into how adversarial examples are created and what they imply for AI systems, and consider potential criticisms of this perspective. For newcomers, a brief sketch of one standard way to construct an adversarial example follows below.
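The sketch below illustrates the general idea of an adversarial perturbation using the Fast Gradient Sign Method (FGSM). It is not the specific procedure from the paper discussed in the lecture; the pretrained model, epsilon value, and class index are illustrative assumptions.

```python
# Minimal FGSM sketch: nudge an input in the direction that increases the
# classifier's loss, producing an imperceptibly perturbed "adversarial" image.
import torch
import torch.nn.functional as F
from torchvision import models

# Any pretrained classifier works here; ResNet-18 is just a convenient stand-in.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm_attack(image, label, epsilon=0.03):
    """Perturb `image` (shape 1xCxHxW, pixels in [0, 1]) to raise the loss on `label`."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step along the sign of the loss gradient, then clamp back to valid pixel range.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()

# Usage: a random tensor stands in for a real, correctly classified input image.
x = torch.rand(1, 3, 224, 224)
y = torch.tensor([207])  # hypothetical true class index
x_adv = fgsm_attack(x, y)
print(model(x).argmax(1).item(), model(x_adv).argmax(1).item())
```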

Syllabus

Intro
What is an adversarial example
The fundamental idea
Feature definition
Experimental evidence
How adversarial examples are created
Criticisms

Taught by

Yannic Kilcher

