Concentration Functions and Entropy Bounds for Discrete Log-Concave Distributions
Hausdorff Center for Mathematics via YouTube
Overview
Explore concentration functions and entropy bounds for discrete log-concave distributions in this 44-minute lecture by Sergey Bobkov. Delve into two-sided bounds and their applications in deriving variants of entropy power inequalities. Learn about joint work with Arnaud Marsiglietti and James Melbourne on Rényi entropies within the class of discrete log-concave probability distributions.
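For context, here is a brief sketch of the standard definitions behind the terms in the overview; these are textbook conventions, not statements taken from the lecture itself.

A probability sequence $(p_k)_{k \in \mathbb{Z}}$ is log-concave if its support is a contiguous set of integers and
$$
p_k^2 \ge p_{k-1}\, p_{k+1} \quad \text{for all } k \in \mathbb{Z}.
$$
The Rényi entropy of order $\alpha \in (0,1) \cup (1,\infty)$ of a discrete random variable $X$ with distribution $(p_k)$ is
$$
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_k p_k^{\alpha},
$$
which recovers the Shannon entropy $H(X) = -\sum_k p_k \log p_k$ in the limit $\alpha \to 1$. The (Lévy) concentration function of $X$ is
$$
Q(X; \lambda) = \sup_{x} \, \mathbb{P}\bigl(x \le X \le x + \lambda\bigr), \qquad \lambda \ge 0.
$$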
Syllabus
Sergey Bobkov: Concentration functions and entropy bounds for discrete log-concave distributions
Taught by
Hausdorff Center for Mathematics