Overview
Explore the Information Bottleneck theory of deep neural networks in this lecture by Naftali Tishby of the Hebrew University of Jerusalem. Delve into statistical learning theory, neural network applications, and information theory. Examine concepts such as soft partitioning, the information plane, and stochastic gradient descent, along with per-layer averages, classical theory, dimensionality, confidence, factorization, cardinality, and the ultimate bound. Gain insights into targeted discovery in brain data and deepen your understanding of deep neural networks through this comprehensive presentation from the Simons Institute.
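The information plane referenced above plots each network layer T as a point (I(X;T), I(T;Y)), where I denotes mutual information. As a minimal sketch (the function name and the toy joint table are illustrative, not from the lecture), mutual information for a discrete joint distribution can be computed like this:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(A;B) in bits from a joint probability table."""
    joint = joint / joint.sum()
    pa = joint.sum(axis=1, keepdims=True)  # marginal p(a)
    pb = joint.sum(axis=0, keepdims=True)  # marginal p(b)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (pa @ pb)[mask])).sum())

# Toy joint p(x, t): a "layer" T that coarsens 4 inputs into 2 codes,
# i.e. a hard partition of X (soft partitioning would spread mass across codes).
p_xt = np.array([
    [0.25, 0.0],
    [0.25, 0.0],
    [0.0, 0.25],
    [0.0, 0.25],
])
print(mutual_information(p_xt))  # 1.0 bit: T keeps one of X's two bits
```

Estimating such quantities per layer during training is what produces the information-plane trajectories discussed in the lecture.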
Syllabus
Intro
Statistical Learning Theory
Neural Network Applications
Information Theory
Soft Partitioning
Information Plane
Stochastic Gradient Descent
Average Per Layer
Classical Theory
Dimensionality
Confidence
Factorization
Cardinality
The Ultimate Bound
Taught by
Simons Institute