
From Information Theory to Learning via Statistical Physics - Introduction

International Centre for Theoretical Sciences via YouTube

Overview

Explore the intersection of information theory, statistical physics, and machine learning in this comprehensive lecture by Florent Krzakala. Delve into topics such as classical statistics, high-dimensional statistics, signal processing, and regression, while examining their connections to statistical physics. Learn about Bayes' rule, estimators, and Fisher information, and discover how these concepts apply to real-world problems. Investigate the relationship between statistical mechanics and machine learning, and understand the importance of Bayes risk in discrete problems. Gain insight into the interdisciplinary nature of these fields and their applications to complex physical and biological systems.

Syllabus

US-India Advanced Studies Institute: Classical and Quantum Information
From Information Theory to Learning via Statistical Physics: Introduction: Statistical learning, Bayes' rule, estimators, and statistical physics
Topics
Connecting physics and information theory
Example 1: "Classical statistics"
Prove
Solve the problem
Assume uniform prior
Prove
Fisher information
Example 2: High-dimensional statistics
Signal processing
Regression
Statistical physics problem
Back to a Bayesian formulation
Claim
Statistical mechanics
3. Estimators and Bayes optimality
Bayes risks
Discrete problem
Summary

Taught by

International Centre for Theoretical Sciences
