Huffman Codes - An Information Theory Perspective

Reducible via YouTube

Overview

Dive into the fascinating world of data compression with this comprehensive video on Huffman Codes from an information theory perspective. Trace the evolution of compression algorithms, from the foundational concepts of information theory to the groundbreaking discovery of Huffman Codes. Learn about modeling data compression problems, measuring information, self-information, and entropy. Understand the crucial connection between entropy and compression, and discover how Shannon-Fano coding paved the way for Huffman's improvement. Examine practical Huffman coding examples and implementation techniques. Gain insights into the elegant simplicity of the Huffman algorithm and its significance in the field of data compression. Perfect for those interested in information theory, computer science, and the history of algorithmic breakthroughs.
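
As a companion to the implementation topic, here is a minimal sketch of the greedy Huffman construction in Python. It illustrates the usual formulation (repeatedly merge the two least probable subtrees until one tree remains); it is not the video's own code, and the demo string and names below are made up for illustration.

```python
# A minimal sketch of greedy Huffman coding: repeatedly merge the two
# least likely subtrees, so rarer symbols receive longer codewords.
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Map each symbol to a prefix-free binary codeword."""
    # Heap entries are (weight, tie_breaker, subtree). The unique
    # tie_breaker keeps tuple comparison away from the subtrees.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        # Degenerate one-symbol alphabet: give it the codeword "0".
        return {heap[0][2]: "0"}
    counter = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        # Internal nodes are (left, right) pairs with summed weight.
        heapq.heappush(heap, (w1 + w2, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):   # internal node
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                         # leaf holding an original symbol
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

text = "abracadabra"
codes = huffman_codes(Counter(text))
encoded = "".join(codes[ch] for ch in text)
print(codes)
print(f"{8 * len(text)} bits as ASCII -> {len(encoded)} bits encoded")
```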

Syllabus

Intro
Modeling Data Compression Problems
Measuring Information
Self-Information and Entropy
The Connection between Entropy and Compression
Shannon-Fano Coding
Huffman's Improvement
Huffman Coding Examples
Huffman Coding Implementation
Recap
At , the entropy was calculated with log base 10 instead of the expected log base 2. The correct values should be H(P1) = 1.49 bits and H(P2) = 0.47 bits.
At , all logarithms should be negated; I totally forgot about the negative sign.
At , I should have said that the least likely symbols should have the longest encodings.
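
For reference, the quantities the first two corrections concern are the standard self-information and entropy, written with the negative sign and base-2 logarithms in place:

```latex
% Self-information of a symbol x drawn with probability p(x), in bits:
I(x) = -\log_2 p(x)

% Entropy of a source P: the expected self-information, which
% lower-bounds the average codeword length of any prefix-free code:
H(P) = -\sum_{x} p(x) \log_2 p(x)
```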

Taught by

Reducible
