
Indian Institute of Science Bangalore

Concentration Inequalities

Indian Institute of Science Bangalore and NPTEL via Swayam

Overview

It is well known that functions of large numbers of random quantities tend to behave rather predictably and ‘less randomly’ than their constituents. For instance, the laws of large numbers tell us that the average of many independent random variables is asymptotically equal to the expected value; higher-order refinements such as the central limit theorem and large deviations techniques uncover the asymptotic rate at which this reduction in randomness takes place. However, if one is interested in sharper estimates (for the probability of deviation from the typical value, for a fixed number of observations, for functions other than the average, or for functions of dependent random variables), one must turn to more specific measure concentration bounds. Perhaps the most basic nontrivial examples in this regard are the Markov and Chebyshev inequalities, which are encountered in a first course on probability. This graduate-level course on concentration inequalities covers the basic material on this classic topic and introduces several advanced topics and techniques. The utility of the inequalities derived is illustrated with applications from electrical engineering, computer science and statistics.

A tentative list of topics:

1. Introduction and motivation: limit results and concentration bounds
2. Chernoff bounds: Hoeffding’s inequality, Bennett’s inequality, Bernstein’s inequality
3. Variance bounds: Efron-Stein inequality, Poincaré inequality
4. The entropy method and log-Sobolev inequalities
5. The transportation method
6. Isoperimetric inequalities
7. Other special topics

Prerequisites: A course on probability, random processes, or measure theory; basic mathematical maturity and working familiarity with probability calculations.
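As a small taste of what such bounds say in practice, here is a minimal Python sketch (not part of the course materials) comparing the simulated deviation probability of a Bernoulli sample mean with the two-sided Hoeffding bound for variables in [0, 1]. The choice of parameters n, t, the Bernoulli(0.5) distribution, and the trial count are illustrative assumptions.

```python
import math
import random

def hoeffding_bound(n, t):
    # Two-sided Hoeffding bound for i.i.d. variables taking values in [0, 1]:
    # P(|sample mean - E[X]| >= t) <= 2 * exp(-2 * n * t^2)
    return 2 * math.exp(-2 * n * t * t)

def empirical_deviation_prob(n, t, p=0.5, trials=2000, seed=0):
    # Monte Carlo estimate of P(|sample mean - p| >= t) for n i.i.d.
    # Bernoulli(p) variables. (Parameters here are illustrative.)
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= t:
            hits += 1
    return hits / trials

n, t = 200, 0.1
emp = empirical_deviation_prob(n, t)
bound = hoeffding_bound(n, t)
print(f"empirical deviation probability: {emp:.4f}")
print(f"Hoeffding bound:                 {bound:.4f}")
```

The simulated frequency of large deviations stays below the Hoeffding bound; the gap illustrates why the course pursues sharper, more specialized inequalities beyond this basic Chernoff-type estimate.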

Syllabus

Week 1: Chernoff bounds
Week 2: Concentration bounds for sums and other functions of independent random variables
Week 3: Variance bounds for functions of independent random variables
Week 4: The entropy method for concentration inequalities
Week 5: Entropy method (contd.) and transportation method
Week 6: Transportation method, isoperimetry and concentration
Week 7: Log-Sobolev inequalities revisited
Week 8: Concentration inequalities for sequential data

Taught by

Prof. Himanshu Tyagi and Prof. Aditya Gopalan
