Minimum Entropy of a Log-Concave Random Variable with Fixed Variance

Hausdorff Center for Mathematics via YouTube

Overview

Explore a mathematical lecture on the minimum Shannon differential entropy of a log-concave random variable with fixed variance, showing that the minimum is attained by an exponential random variable. Learn how this result yields upper bounds on the capacity of additive noise channels with log-concave noise, and how it leads to improved constants in reverse entropy power inequalities for log-concave random variables. Based on collaborative research, the talk offers insights into advanced probability theory and its implications for information theory and channel capacity analysis.
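
As a quick illustration of the extremal case described above (a sketch only, assuming the standard definition of differential entropy in nats; the exact form of the bound stated in the lecture may differ), the exponential distribution has a closed-form entropy once its variance is fixed:

% Exponential density with rate \lambda: f(x) = \lambda e^{-\lambda x}, x \ge 0
% Variance: \operatorname{Var}(X) = 1/\lambda^2, so fixing \operatorname{Var}(X) = \sigma^2 gives \lambda = 1/\sigma
h(X) = -\int_0^\infty f(x)\,\ln f(x)\,dx = 1 - \ln\lambda = 1 + \ln\sigma = \tfrac{1}{2}\ln\!\left(e^{2}\sigma^{2}\right)\ \text{nats}.
% Per the lecture's claim, this is the smallest entropy a log-concave random variable
% with variance \sigma^2 can have, with equality for the exponential distribution.

This minimum-entropy statement complements the classical fact that, among all random variables with a fixed variance, the Gaussian maximizes differential entropy.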

Syllabus

Piotr Nayar: Minimum entropy of a log-concave random variable with fixed variance

Taught by

Hausdorff Center for Mathematics
