Huffman Codes - An Information Theory Perspective

Reducible via YouTube

Classroom Contents

  1. Intro
  2. Modeling Data Compression Problems
  3. Measuring Information
  4. Self-Information and Entropy
  5. The Connection between Entropy and Compression
  6. Shannon-Fano Coding
  7. Huffman's Improvement
  8. Huffman Coding Examples
  9. Huffman Coding Implementation
  10. Recap
  11. At , the entropy was calculated with log base 10 instead of the expected log base 2. The correct values should be H(P₁) = 1.49 bits and H(P₂) = 0.47 bits.
  12. At , all logarithms should be negated; I totally forgot about the negative sign (see the formulas after this list).
  13. At , I should have said the least likely symbols should have the *longest* encoding (see the sketch after this list).
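
For reference, errata 11 and 12 concern the standard definitions covered in chapters 3 and 4. A minimal statement in LaTeX, using log base 2 so the results come out in bits (the distributions P₁ and P₂ are defined in the video and not reproduced here):

```latex
% Self-information of a symbol x with probability p(x).
% Note the negative sign erratum 12 refers to:
\[ I(x) = -\log_2 p(x) \]

% Entropy of a distribution P = (p_1, \dots, p_n): the expected
% self-information, i.e. the average number of bits per symbol.
% Base-2 logs (erratum 11) make the unit bits:
\[ H(P) = -\sum_{i=1}^{n} p_i \log_2 p_i \]
```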
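Since erratum 13 and chapter 9 both concern the construction itself, here is a minimal Python sketch of Huffman coding, assuming a `{symbol: frequency}` input; it is an illustration of the greedy merge, not the implementation shown in the video. Because the two least likely subtrees are always merged first, the least likely symbols end up deepest in the tree, i.e. with the longest codewords:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman code from a {symbol: frequency} map.

    Returns a {symbol: bitstring} dict. The least frequent symbols
    receive the longest codewords, as erratum 13 above points out.
    """
    # Heap entries are (frequency, tiebreaker, tree); the unique integer
    # tiebreaker keeps heapq from ever comparing trees directly.
    # A tree is either a symbol (leaf) or a (left, right) tuple.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        # Degenerate one-symbol alphabet: give it the single bit "0".
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two least likely
        f2, _, right = heapq.heappop(heap)   # subtrees are merged first
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def assign(tree, prefix):
        # Left edges append "0", right edges append "1".
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    assign(heap[0][2], "")
    return codes

# Rarer symbols get longer codewords:
print(huffman_code(Counter("aaaabbbccd")))
# -> 'a' gets 1 bit, 'b' gets 2 bits, 'c' and 'd' get 3 bits each
```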
