OpenAI Fine-tuning vs Distillation - Techniques and Implementation
Overview
Syllabus
Fine-tuning and Distilling with OpenAI
Video Overview and Colab Notebook
Why bother with distilling or fine-tuning?
How is distillation different from fine-tuning?
Simple Approach to Distillation
Installation, student and teacher model selection
Set up training questions
Setting up simple evaluation
Generating and storing a distilled dataset
Running fine-tuning on the distilled dataset
Evaluating the fine-tuned model
Advanced techniques to generate larger datasets
Results from comprehensive fine-tuning and distillation from gpt-4o to gpt-4o-mini
Notes on OpenAI Evals and Gemini fine-tuning
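The distillation workflow in the syllabus above hinges on storing the teacher model's answers in the chat-format JSONL file that OpenAI fine-tuning accepts. A minimal sketch of that step follows; the question/answer pairs are illustrative stand-ins for teacher (e.g. gpt-4o) outputs, not data from the course.

```python
import json

# Hypothetical distilled pairs: in practice these answers would come from
# calling the teacher model (e.g. gpt-4o) on the training questions.
distilled_pairs = [
    ("What is 17 * 23?", "17 * 23 = 391."),
    ("Name the capital of Australia.", "The capital of Australia is Canberra."),
]

def to_finetune_record(question, teacher_answer):
    """Wrap one distilled Q/A pair in the chat-format record that
    OpenAI fine-tuning expects (one JSON object per line)."""
    return {
        "messages": [
            {"role": "user", "content": question},
            {"role": "assistant", "content": teacher_answer},
        ]
    }

# Write the distilled dataset as JSONL, ready to upload for fine-tuning
# the student model (e.g. gpt-4o-mini).
with open("distilled.jsonl", "w") as f:
    for q, a in distilled_pairs:
        f.write(json.dumps(to_finetune_record(q, a)) + "\n")
```

The resulting file can then be uploaded and referenced when creating a fine-tuning job for the student model.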
Taught by
Trelis Research