
Knowledge Distillation in Efficient Machine Learning - Lecture 9

MIT HAN Lab via YouTube

Overview

Explore knowledge distillation techniques in this 58-minute lecture from MIT's EfficientML.ai course (6.5940, Fall 2024). Delve into the principles and applications of knowledge distillation as presented by Prof. Song Han from the MIT HAN Lab. Gain insights into how this technique can be used to transfer knowledge from larger, more complex models to smaller, more efficient ones. Access accompanying slides at efficientml.ai to enhance your understanding of this crucial topic in machine learning optimization.
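The core idea the overview describes — transferring knowledge from a large teacher model to a smaller student by matching the teacher's softened output distribution — can be sketched as follows. This is a minimal illustration of the standard temperature-scaled distillation loss (Hinton et al.'s soft-target formulation), not code from the lecture; the temperature and logit values are made up for demonstration.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return (T ** 2) * kl.mean()

# Hypothetical logits for a 3-class task, batch of 2 examples.
teacher = np.array([[4.0, 1.0, 0.5], [0.2, 3.0, 0.1]])
student = np.array([[3.0, 1.5, 0.2], [0.5, 2.0, 0.3]])
print(distillation_loss(student, teacher, T=4.0))
```

In practice this soft-target term is combined with the ordinary cross-entropy loss on ground-truth labels, weighted by a mixing coefficient, and the student is trained by gradient descent on the sum.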

Syllabus

EfficientML.ai Lecture 9 - Knowledge Distillation (MIT 6.5940, Fall 2024)

Taught by

MIT HAN Lab
