
University of Oxford

Information Theory: Defining Entropy and Information - Lecture 1

Overview

Explore fundamental concepts of information theory in this Oxford Mathematics third-year undergraduate lecture on measuring the information content of random variables through entropy. Sam Cohen explains in detail how to quantify the information gained from observing the outcome of a random variable, introducing entropy and related mathematical quantities. This 54-minute lecture is the first in an eight-lecture series and reflects how the subject is taught at the University of Oxford, where students typically follow up lectures with small-group tutorials to deepen their understanding through problem-solving and mathematical discussion.
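The entropy the lecture defines is the expected information content of a random variable's outcome, measured in bits. As a minimal illustration (not code from the lecture itself), the standard Shannon entropy formula H(X) = -Σ p(x) log₂ p(x) can be sketched as:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    Terms with p == 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information:
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is more predictable, so observing it
# yields less information:
print(shannon_entropy([0.9, 0.1]))    # ~0.469
```

A uniform distribution over 2ⁿ outcomes gives exactly n bits, the maximum possible for that number of outcomes.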

Syllabus

Information Theory: Defining Entropy and Information - Oxford Mathematics 3rd Year Student Lecture

Taught by

Oxford Mathematics
