Minimum Entropy of a Log-Concave Random Variable with Fixed Variance
Hausdorff Center for Mathematics via YouTube

Overview
Explore a mathematical lecture on the minimum entropy of a log-concave random variable with fixed variance, showing that the exponential distribution attains this minimum in Shannon differential entropy. Learn how the result yields upper bounds on the capacity of additive noise channels with log-concave noise, and how it improves the constants in reverse entropy power inequalities for log-concave random variables. Based on joint work, the talk offers insights into advanced probability theory and its implications for information theory and channel capacity analysis.
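For context, a minimal sketch of the headline inequality, assuming the standard formulation with differential entropy $h$ taken in nats (the precise statement and constants are the ones given in the lecture):

\[
h(X) \;\ge\; \tfrac{1}{2}\log \operatorname{Var}(X) + 1
\qquad \text{for every log-concave random variable } X,
\]

with equality for the exponential distribution: if $X \sim \mathrm{Exp}(\lambda)$, then $\operatorname{Var}(X) = 1/\lambda^{2}$ and $h(X) = 1 - \log\lambda = \tfrac{1}{2}\log\operatorname{Var}(X) + 1$.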
Syllabus
Piotr Nayar: Minimum entropy of a log-concave random variable with fixed variance
Taught by
Hausdorff Center for Mathematics