CAP6412 - SmoothGrad: Removing Noise by Adding Noise - Lecture

University of Central Florida via YouTube

Overview

Explore a comprehensive lecture on SmoothGrad, a technique that sharpens gradient-based sensitivity maps of deep neural networks by averaging the maps computed on several noise-perturbed copies of the input. Delve into the paper's details, starting with an overview and a definition of sensitivity maps. Examine previous work in the field before focusing on the SmoothGrad proposal. Analyze the experiments, including the models used, visualization choices, parameter settings, and comparisons to baseline methods. Investigate how SmoothGrad combines with other attribution techniques and what happens when noise is also added during training. Conclude with a critical evaluation of the paper's strengths and weaknesses, providing a well-rounded understanding of this approach to reducing noise in neural network visualizations.
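
The core idea is simple enough to sketch. Below is a minimal SmoothGrad sketch in PyTorch; the model, target class, sample count, and noise fraction are illustrative placeholders rather than the lecture's exact setup, and the sensitivity map is the plain class-score gradient averaged over noisy copies of the input.

import torch

def smoothgrad(model, x, target_class, n_samples=50, noise_frac=0.15):
    """Average class-score gradients over n noisy copies of input x (e.g. a C x H x W image)."""
    model.eval()
    sigma = noise_frac * (x.max() - x.min())  # noise scale set relative to the input's value range
    grad_sum = torch.zeros_like(x)
    for _ in range(n_samples):
        noisy = (x + sigma * torch.randn_like(x)).requires_grad_(True)  # perturbed copy of the input
        score = model(noisy.unsqueeze(0))[0, target_class]  # logit of the class being explained
        score.backward()
        grad_sum += noisy.grad
    return grad_sum / n_samples  # averaged sensitivity map

Setting n_samples to 1 and noise_frac to 0 recovers the vanilla sensitivity map; the sample count and noise level are the parameters the lecture revisits in the experiments.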

Syllabus

Paper Details
Overview
Definition of Sensitivity Maps
Previous Work
SmoothGrad Proposal
Experiments - Models
Experiments - Visualization (Value of gradients)
Experiments - Visualization (Capping Values)
Experiments - Visualization (Multiplying with Input) - see the sketch after this syllabus
Experiments - Parameters
Experiments - Comparison to Baseline Methods
Experiments - Combining SmoothGrad
Experiments - Adding Noise During Training
Conclusion
For Paper
Against Paper
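
The three visualization items above (taking gradient magnitudes, capping outlying values, and multiplying the map with the input) are post-processing choices applied to the averaged gradient map before it is displayed. A minimal NumPy sketch, assuming grad_map and image are arrays of the same shape and treating the 99th-percentile cap as an illustrative default:

import numpy as np

def to_heatmap(grad_map, image=None, cap_percentile=99):
    """Turn an averaged gradient map into a displayable heatmap."""
    m = np.abs(grad_map)                                  # value of gradients: keep magnitudes only
    m = np.minimum(m, np.percentile(m, cap_percentile))   # capping values: clip outlying gradients
    if image is not None:
        m = m * image                                     # multiplying with input variant
    m = m - m.min()
    return m / (m.max() + 1e-12)                          # rescale to [0, 1] for display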

Taught by

UCF CRCV
