Overview
Syllabus
Intro
Modeling Data Compression Problems
Measuring Information
Self-Information and Entropy
The Connection between Entropy and Compression
Shannon-Fano Coding
Huffman's Improvement
Huffman Coding Examples
Huffman Coding Implementation
Recap
At , the entropy was calculated with log base 10 instead of the expected log base 2. The correct values should be H(P1) = 1.49 bits and H(P2) = 0.47 bits.
At , all logarithms should be negated; I totally forgot about the negative sign.
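Both corrections above concern the entropy formula H(P) = -Σ p·log2(p): the logarithm must be base 2 (to get bits) and the sum must be negated. A minimal sketch of the correct calculation, using a made-up example distribution (the probabilities below are illustrative, not the ones from the video):

```python
import math

def entropy(probs):
    # H(P) = -sum(p * log2(p)); note the leading negative sign
    # and the base-2 logarithm, which gives the answer in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example distribution (hypothetical): P = (1/2, 1/4, 1/4)
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits
```

Using math.log10 here, or dropping the minus sign, reproduces exactly the two mistakes noted above.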
At , I should have said the least likely symbols should have the longest encodings.
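The corrected statement — least likely symbols get the longest codewords — falls out of Huffman's algorithm naturally, since the two lowest-weight nodes are merged first and end up deepest in the tree. A minimal sketch (not the implementation from the video; `huffman_codes` and the example frequencies are illustrative):

```python
import heapq

def huffman_codes(freqs):
    # Heap entries: (weight, unique tiebreaker, {symbol: code-so-far}).
    # The tiebreaker keeps comparisons from ever reaching the dict.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least likely subtrees; their symbols each
        # gain one more bit, so rare symbols end up with long codes.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

# Hypothetical distribution: rarest symbols 'c' and 'd' get the longest codes.
codes = huffman_codes({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(codes)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} up to 0/1 labeling
```

For this dyadic distribution the code lengths (1, 2, 3, 3 bits) match each symbol's self-information exactly, tying the errata's two topics together.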
Taught by
Reducible