Knowledge Distillation in Efficient Machine Learning - Lecture 9

MIT HAN Lab via YouTube

Overview

Explore Knowledge Distillation in this 59-minute lecture from MIT's EfficientML.ai course (6.5940, Fall 2024), presented by Prof. Song Han. Learn how knowledge can be transferred from large, complex teacher models to smaller, more efficient student models, and examine the principles, methodologies, and applications of Knowledge Distillation in the context of efficient machine learning. Accompanying slides are available for a comprehensive understanding of the topic. Ideal for students, researchers, and professionals interested in advanced machine learning techniques and model optimization.
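
For readers who want a concrete picture of the technique before watching, the sketch below illustrates the classic soft-target distillation loss (Hinton et al., 2015): a temperature-softened teacher distribution supervises the student alongside the ordinary hard-label loss. This is a minimal sketch assuming PyTorch; the function name and the default values of the temperature T and weight alpha are illustrative choices, not taken from the lecture.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Combine a soft-target KL term (teacher -> student) with hard-label cross-entropy.

    T: temperature that softens both distributions; alpha: weight on the soft term.
    (Hypothetical helper for illustration, not code from the lecture.)
    """
    # Soft targets: KL divergence between temperature-scaled distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example usage with placeholder tensors: a batch of 8 samples, 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```

In practice the teacher logits come from a frozen, pre-trained large model run in inference mode, and only the student's parameters are updated with this loss.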

Syllabus

EfficientML.ai Lecture 9 - Knowledge Distillation (MIT 6.5940, Fall 2024, Zoom recording)

Taught by

MIT HAN Lab
