Class Central Classrooms (beta)
YouTube videos curated by Class Central.
Classroom Contents
Huffman Codes - An Information Theory Perspective
- 1 Intro
- 2 Modeling Data Compression Problems
- 3 Measuring Information
- 4 Self-Information and Entropy
- 5 The Connection between Entropy and Compression
- 6 Shannon-Fano Coding
- 7 Huffman's Improvement
- 8 Huffman Coding Examples
- 9 Huffman Coding Implementation
- 10 Recap
- 11 At , the entropy was calculated with log base 10 instead of the expected log base 2. The correct values should be H(P₁) = 1.49 bits and H(P₂) = 0.47 bits.
- 12 At , all logarithms should be negated; I totally forgot about the negative sign. Both of these corrections are illustrated in the entropy sketch after this list.
- 13 At , I should have said the least likely symbols should have the *longest* encoding, as the Huffman sketch after this list shows.
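
The first two corrections concern the entropy formula H(P) = −Σ p(x) log₂ p(x): the logarithm must be base 2 for the result to come out in bits, and the whole sum is negated. A minimal Python sketch of the corrected calculation; the distribution `P` below is made up for illustration, not the one from the video:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(P) = -sum(p * log2(p)).

    Both corrections above appear here: log base 2 (not base 10),
    and the leading negative sign.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distribution, for illustration only.
P = [0.5, 0.25, 0.125, 0.125]
print(entropy(P))  # 1.75 bits
```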
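
The third correction states the core invariant of Huffman coding: the least likely symbols receive the longest codewords, freeing the shortest ones for the most frequent symbols. A minimal Python sketch of the greedy merge; the symbol frequencies here are illustrative, not taken from the video:

```python
import heapq

def huffman_codes(freqs):
    """Build a Huffman code from a {symbol: frequency} map.

    Repeatedly merge the two least frequent subtrees; the rarest
    symbols are merged earliest and so end up deepest in the tree,
    i.e. with the longest codewords.
    """
    # Heap entries: (frequency, tiebreaker, {symbol: codeword}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# Illustrative frequencies: 'e' is common, 'z' is rare.
print(huffman_codes({"e": 40, "a": 30, "t": 20, "z": 10}))
# {'e': '0', 'a': '10', 'z': '110', 't': '111'} -- the rarest
# symbols get the longest codewords.
```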