
YouTube

Interpretable Representation Learning for Visual Intelligence

Bolei Zhou via YouTube

Overview

Explore a comprehensive thesis defense presentation on interpretable representation learning for visual intelligence. Delve into deep neural networks for object classification, network visualization techniques, and interpretable representations for objects and scenes. Learn about class activation mapping for explaining deep neural network predictions, weakly-supervised localization, and temporal relational networks for event recognition. Gain insights into the interpretability of medical models and understand the contributions made to the field of visual intelligence.
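Class activation mapping (CAM), one of the core techniques covered, produces a heatmap of the image regions that drive a prediction by weighting the final convolutional feature maps with the classifier weights of the chosen class. The snippet below is a minimal sketch of that idea, assuming a torchvision ResNet-18 in PyTorch (a network with global average pooling before its final fully connected layer, which is what CAM requires); the model choice and helper names are illustrative and not taken from the talk.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Minimal CAM sketch (assumption: torchvision ResNet-18; any CNN with global
# average pooling before its final linear layer works the same way).
model = models.resnet18(weights="IMAGENET1K_V1").eval()

features = {}

def _capture(module, inputs, output):
    # Keep the final convolutional feature maps, shape (1, C, H, W).
    features["maps"] = output

model.layer4.register_forward_hook(_capture)

def class_activation_map(image, class_idx=None):
    """Return a normalized CAM heatmap for class_idx (predicted class if None)."""
    with torch.no_grad():
        logits = model(image)                    # image: (1, 3, 224, 224)
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()
    fmaps = features["maps"][0]                  # (C, H, W)
    weights = model.fc.weight[class_idx]         # (C,) classifier weights for the class
    # CAM = sum_k w_k^c * f_k(x, y): weight each feature map by its class weight.
    cam = torch.einsum("c,chw->hw", weights, fmaps)
    cam = F.relu(cam)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    # Upsample to the input resolution so the heatmap can be overlaid on the image.
    cam = F.interpolate(cam[None, None], size=image.shape[-2:],
                        mode="bilinear", align_corners=False)[0, 0]
    return cam, class_idx
```

Thresholding such a heatmap and taking the bounding box of the high-activation region is, roughly, how CAM is used for the weakly-supervised localization results mentioned in the overview.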

Syllabus

Intro
Deep Neural Networks for Object Classification
Interpretability of Deep Neural Networks
Thesis Outline
Object Classification vs. Scene Recognition
Visualizing Units
Related Work on Network Visualization
Annotating the Interpretation of Units
Interpretable Representations for Objects and Scenes
Evaluate Unit for Semantic Segmentation
ImageNet Pretrained Network
Class Activation Mapping: Explaining Predictions of Deep Neural Networks
Evaluation on Weakly-Supervised Localization
Explaining the Failure Cases in Video
Interpreting Medical Models
Summary of Contributions
Temporal Relational Networks for Event Recognition
Acknowledgement

Taught by

Bolei Zhou
