
Unveiling Hidden Backdoors in Manifold Distribution Gaps

BIMSA via YouTube

Overview

Explore the critical security concern of backdoor attacks on deep neural networks in this 55-minute conference talk from ICBS2024. Delve into the innovative approach of separating classification models into manifold embedding and classifier components. Discover how mode mixture features within manifold distribution gaps can be exploited as backdoors to extend decision boundaries. Learn about a universal backdoor attack framework applicable across various data modalities, offering high explainability and stealthiness. Examine the effectiveness of this method on high-dimensional natural datasets and gain insights into the potential vulnerabilities of classification models.
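The core idea described above, splitting a classifier into a manifold embedding plus a classifier head, then hijacking the low-density "gap" between class modes as a backdoor region, can be sketched in a toy example. Everything below is an illustrative assumption, not the speaker's actual method: the embedding, the two-mode nearest-centroid classifier, and the `gap_radius` threshold are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(x):
    # Stand-in manifold embedding: keep the first two coordinates.
    # (A real attack would use a learned embedding network.)
    return x[:, :2]

# Two class "modes" on the learned manifold.
modes = np.array([[0.0, 0.0], [4.0, 0.0]])

def classify(z):
    # Clean nearest-mode classifier.
    d = np.linalg.norm(z[:, None, :] - modes[None, :, :], axis=-1)
    return d.argmin(axis=1)

def classify_backdoored(z, target=1, gap_radius=1.0):
    # Backdoored variant: points lying in the distribution gap
    # (far from every mode, i.e. mode-mixture features) are assigned
    # the attacker's target label, silently extending the decision
    # boundary without disturbing in-distribution predictions.
    d = np.linalg.norm(z[:, None, :] - modes[None, :, :], axis=-1)
    labels = d.argmin(axis=1)
    in_gap = d.min(axis=1) > gap_radius
    labels[in_gap] = target
    return labels

# Clean samples cluster around mode 0; the trigger sits in the gap.
clean = modes[np.zeros(5, dtype=int)] + 0.1 * rng.standard_normal((5, 2))
trigger = np.array([[2.0, 3.0]])

x_clean = np.hstack([clean, np.zeros((5, 1))])
x_trig = np.hstack([trigger, np.zeros((1, 1))])

print(classify_backdoored(embed(x_clean)))  # clean inputs keep label 0
print(classify_backdoored(embed(x_trig)))   # gap trigger -> target label 1
```

Because the trigger lives where the training distribution has no support, the backdoor is stealthy on clean data, which is the intuition behind the "high explainability and stealthiness" claim in the talk summary.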

Syllabus

Min Zhang: Unveiling Hidden Backdoors in Manifold Distribution Gaps #ICBS2024

Taught by

BIMSA

