EfficientML.ai: Quantization Part II - Lecture 6

MIT HAN Lab via YouTube

Overview

Dive into the second part of a comprehensive lecture on quantization in machine learning, delivered by Prof. Song Han as part of MIT's 6.5940 course for Fall 2023. Explore advanced concepts such as linear quantization, scaling factors, convolution, and post-training quantization methods. Learn about quantization granularity, per-channel quantization, and the importance of clipping in the quantization process. Discover how to select optimal clipping ranges and fine-tune quantized models. Examine weight and activation quantization through practical examples, and gain insights into binary and ternary quantization techniques, including stochastic binarization. Access the accompanying slides at efficientml.ai for a deeper understanding of these quantization strategies for efficient machine learning.
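As a rough illustration of the linear quantization, scaling factor, and clipping concepts described above, the sketch below quantizes a weight tensor to 8-bit integers with NumPy. It is a minimal, assumed formulation (affine quantization with a per-tensor or per-channel scale); the function names are illustrative and are not taken from the course materials.

import numpy as np

def linear_quantize(w, num_bits=8, per_channel=False):
    """Affine (linear) quantization of a weight tensor to signed integers.

    A scale factor S and zero point Z are chosen so that w ~= S * (q - Z),
    and q is clipped to the representable integer range. With
    per_channel=True, each output channel (axis 0) gets its own scale and
    zero point, i.e. per-channel granularity.
    """
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    axis = tuple(range(1, w.ndim)) if per_channel else None

    w_min = w.min(axis=axis, keepdims=True)
    w_max = w.max(axis=axis, keepdims=True)

    # The scale factor maps the floating-point range onto the integer grid.
    scale = (w_max - w_min) / (qmax - qmin)
    scale = np.where(scale == 0, 1e-8, scale)  # guard against constant channels
    zero_point = np.round(qmin - w_min / scale)

    # Quantize, then clip to the valid integer range (the clipping step).
    q = np.clip(np.round(w / scale + zero_point), qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def linear_dequantize(q, scale, zero_point):
    """Reconstruct an approximate floating-point tensor from the integers."""
    return scale * (q.astype(np.float32) - zero_point)

# Example: quantize a random "conv weight" and measure reconstruction error.
w = np.random.randn(16, 3, 3, 3).astype(np.float32)
q, s, z = linear_quantize(w, num_bits=8, per_channel=True)
print("max abs error:", np.abs(w - linear_dequantize(q, s, z)).max())

Per-channel granularity typically reduces reconstruction error for convolutional weights because each output channel gets its own scale, which is one of the trade-offs the lecture discusses.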

Syllabus

Introduction
Outline
Agenda
Linear Quantization
Original Weight
Scaling Factor
Convolution
Post-training quantization
Quantization granularity
Per-channel quantization
Scaling Factor
Clipping
Selecting clipping range
Fine-tuning
Weight
Activation
Quantization Example
Quantization Notation
Quantization Results
Binary / ternary quantization
Stochastic binarization (see the sketch after this syllabus)
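
To make the last two syllabus items more concrete, here is a minimal sketch of stochastic binarization as it is commonly formulated (e.g., in BinaryConnect): each weight maps to +1 with a probability given by a hard sigmoid of its value, and to -1 otherwise. The function names are illustrative assumptions, not code from the lecture.

import numpy as np

def stochastic_binarize(w, rng=None):
    """Binarize weights to {-1, +1} stochastically.

    Each weight becomes +1 with probability p = clip((w + 1) / 2, 0, 1)
    (a "hard sigmoid"), and -1 otherwise, so the result equals the real
    weight in expectation for values in [-1, 1].
    """
    rng = np.random.default_rng() if rng is None else rng
    p = np.clip((w + 1.0) / 2.0, 0.0, 1.0)
    return np.where(rng.random(w.shape) < p, 1.0, -1.0).astype(np.float32)

def deterministic_binarize(w):
    """Deterministic alternative, shown for comparison: take the sign."""
    return np.where(w >= 0, 1.0, -1.0).astype(np.float32)

w = np.random.uniform(-1.0, 1.0, size=(4, 4)).astype(np.float32)
print(stochastic_binarize(w))
print(deterministic_binarize(w))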

Taught by

MIT HAN Lab
