
Stanford University

Data Compression I - Lecture 3: Kraft Inequality, Entropy, and Introduction to SCL

Stanford University via YouTube

Overview

Explore the fundamental concepts of data compression in this lecture from Stanford University's EE274 course. Delve into the Kraft inequality, a key theorem in information theory that constrains the codeword lengths any prefix-free code can have, and in turn ties achievable codeword lengths to probability distributions. Examine entropy, which quantifies the average information content of a source and sets the fundamental limit on lossless compression. Get an introduction to the Stanford Compression Library (SCL), the Python library used throughout the course to implement and experiment with compression algorithms. Follow along with Professor Tsachy Weissman, Shubham Chandak, and Pulkit Tandon as they guide you through these essential topics in data compression theory and applications. Access additional course materials and enrollment information through the provided links.
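
As a quick illustration of the two main ideas (a minimal sketch with made-up codeword lengths and probabilities, not material from the lecture itself), the Python snippet below checks the Kraft inequality for a set of codeword lengths and computes the entropy of a matching source:

```python
import math

# Hypothetical codeword lengths for a binary prefix code over 4 symbols
# (illustrative values only).
codeword_lengths = [1, 2, 3, 3]

# Kraft inequality: a binary prefix code with these lengths exists
# if and only if sum(2^-l) <= 1.
kraft_sum = sum(2 ** -l for l in codeword_lengths)
print(f"Kraft sum = {kraft_sum}")                # 1.0 -> lengths are achievable
print(f"Prefix code exists: {kraft_sum <= 1}")   # True

# Entropy H(X) = -sum p(x) * log2 p(x): the lower bound on the
# expected codeword length (in bits/symbol) of any lossless code.
probabilities = [0.5, 0.25, 0.125, 0.125]
entropy = -sum(p * math.log2(p) for p in probabilities)
print(f"Entropy = {entropy} bits/symbol")        # 1.75

# For this source (probabilities are negative powers of two),
# the expected codeword length exactly matches the entropy.
avg_len = sum(p * l for p, l in zip(probabilities, codeword_lengths))
print(f"Expected length = {avg_len} bits/symbol")  # 1.75
```

Because every probability here is a negative power of two, the optimal lengths are exactly -log2 p(x); for general sources, the expected length of an optimal prefix code lies between the entropy and the entropy plus one bit per symbol.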

Syllabus

Stanford EE274: Data Compression I 2023 I Lecture 3 - Kraft Inequality, Entropy, Introduction to SCL

Taught by

Stanford Online
