

OpenAI Fine-tuning vs Distillation - Techniques and Implementation

Trelis Research via YouTube

Overview

Explore the differences between OpenAI fine-tuning and distillation in this 23-minute video tutorial. Learn why these techniques are valuable and how distillation differs from fine-tuning. Follow along with a free Colab notebook to implement a simple distillation approach, covering model selection, training question setup, evaluation, dataset generation, and fine-tuning. Discover advanced techniques for creating larger datasets and review comprehensive results from fine-tuning and distilling GPT-4o down to GPT-4o-mini. Gain insights on OpenAI Evals and Gemini fine-tuning, with additional resources provided for synthetic data preparation and model distillation from scratch.
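
To make the distillation step concrete, here is a minimal sketch of the data-generation stage, assuming the OpenAI Python SDK (v1+) and an OPENAI_API_KEY in the environment. The question list and file name are illustrative placeholders, not taken from the video's notebook.

import json
from openai import OpenAI

client = OpenAI()

TEACHER = "gpt-4o"  # teacher model, matching the video's results
questions = [       # hypothetical training questions
    "Explain the difference between fine-tuning and distillation.",
    "When would you distill a large model into a smaller one?",
]

# Ask the teacher to answer each question, then store the pairs in the
# chat format that OpenAI's fine-tuning endpoint expects (JSONL).
with open("distilled_dataset.jsonl", "w") as f:
    for q in questions:
        answer = client.chat.completions.create(
            model=TEACHER,
            messages=[{"role": "user", "content": q}],
        ).choices[0].message.content
        record = {"messages": [
            {"role": "user", "content": q},
            {"role": "assistant", "content": answer},
        ]}
        f.write(json.dumps(record) + "\n")

The student model is then fine-tuned on this file, so it learns to imitate the teacher's answers rather than human-labelled data.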

Syllabus

Fine-tuning and Distilling with OpenAI
Video Overview and Colab Notebook
Why bother with distilling or fine-tuning?
How is distillation different from fine-tuning?
Simple Approach to Distillation
Installation, student and teacher model selection
Set up training questions
Setting up simple evaluation
Generating and storing a distilled dataset
Running fine-tuning on the distilled dataset (a sketch follows after this syllabus)
Evaluating the fine-tuned model
Advanced techniques to generate larger datasets
Results from comprehensive fine-tuning and distillation from gpt-4o to gpt-4o-mini
Notes on OpenAI Evals and Gemini fine-tuning
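
As a companion to the syllabus steps above, here is a hedged sketch of launching the fine-tuning job on the distilled dataset, again assuming the OpenAI Python SDK; the file name matches the earlier sketch, and gpt-4o-mini-2024-07-18 is a dated snapshot the fine-tuning endpoint accepts.

from openai import OpenAI

client = OpenAI()

# Upload the distilled dataset produced in the earlier sketch.
training_file = client.files.create(
    file=open("distilled_dataset.jsonl", "rb"),
    purpose="fine-tune",
)

# Launch the fine-tuning job with gpt-4o-mini as the student model;
# the endpoint expects a dated model snapshot.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",
)
print(job.id, job.status)  # poll client.fine_tuning.jobs.retrieve(job.id) until done

Once the job finishes, the resulting model (job.fine_tuned_model) can be queried through the same chat completions API and compared against the teacher on held-out questions, mirroring the simple evaluation step in the video.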

Taught by

Trelis Research
