
Stanford University

Context-based Arithmetic Coding and LLM Compression - Stanford EE274 Lecture 9

Stanford University via YouTube

Overview

Explore advanced data compression techniques in this Stanford University lecture focusing on context-based arithmetic coding and large language model (LLM) compression. Delve into the intricacies of these cutting-edge methods as presented by Professor Tsachy Weissman, an expert in electrical engineering, along with researchers Shubham Chandak and Pulkit Tandon. Gain insights into the theoretical foundations and practical applications of context-based arithmetic coding, and discover how these principles are applied to compress large language models. Follow along with the comprehensive course materials available on the Stanford Data Compression Class website, and consider enrolling in the full online course for a deeper understanding of data compression theory and its real-world implementations.
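Purely as an illustration of the lecture's core idea (not material taken from the lecture itself), the minimal Python sketch below shows what "context-based" means in this setting: an adaptive order-1 model assigns each symbol a probability conditioned on the previous symbol, and an arithmetic coder can then spend roughly -log2(p) bits on that symbol. To keep the sketch short it only tallies that ideal code length rather than emitting actual bits, and every name in it is invented for the example.

import math
from collections import defaultdict

def ideal_code_length_bits(data: bytes) -> float:
    """Estimate the bits an arithmetic coder would use with an adaptive
    order-1 (previous-byte) context model and add-one smoothing."""
    counts = defaultdict(lambda: defaultdict(int))  # counts[context][symbol]
    totals = defaultdict(int)                       # totals[context]
    alphabet_size = 256
    bits = 0.0
    context = 0  # arbitrary fixed starting context
    for symbol in data:
        # Laplace-smoothed conditional probability p(symbol | context).
        p = (counts[context][symbol] + 1) / (totals[context] + alphabet_size)
        bits += -math.log2(p)          # ideal arithmetic-coding cost for this symbol
        counts[context][symbol] += 1   # adapt the model after "coding" the symbol
        totals[context] += 1
        context = symbol
    return bits

if __name__ == "__main__":
    text = b"abracadabra abracadabra abracadabra"
    bits = ideal_code_length_bits(text)
    print(f"{len(text)} bytes -> about {bits:.1f} bits ({bits / len(text):.2f} bits/byte)")

A better context model (for example, one backed by a large language model's next-token probabilities) plugs into the same interface: the coder's cost per symbol is still about -log2 of the model's predicted probability, which is why stronger predictors yield stronger compressors.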

Syllabus

Stanford EE274: Data Compression I 2023 I Lecture 9 - Context-based AC & LLM Compression

Taught by

Stanford Online

