Class Central Classrooms beta
YouTube videos curated by Class Central.
OpenAI Fine-tuning vs Distillation - Techniques and Implementation
Classroom Contents
- 1 Fine-tuning and Distilling with OpenAI
- 2 Video Overview and Colab Notebook
- 3 Why bother with distilling or fine-tuning?
- 4 How is distillation different to fine-tuning?
- 5 Simple Approach to Distillation
- 6 Installation, student and teacher model selection
- 7 Set up training questions
- 8 Setting up simple evaluation
- 9 Generating and storing a distilled dataset
- 10 Running fine-tuning on the distilled dataset
- 11 Evaluating the fine-tuned model
- 12 Advanced techniques to generate larger datasets
- 13 Results from comprehensive fine-tuning and distillation from gpt-4o to gpt-4o-mini
- 14 Notes on OpenAI Evals and Gemini fine-tuning
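The core workflow covered in items 5 through 11 (pick a teacher and a student model, collect the teacher's answers to a set of training questions, store them as a dataset, then fine-tune the student on it) can be sketched with the openai Python SDK. The sketch below is a minimal illustration only: the question list, file name, and model snapshot are assumed placeholders, not taken from the videos.

```python
# Hedged sketch of a simple distillation loop: gpt-4o as teacher, gpt-4o-mini as student.
# Questions and file names are illustrative; assumes OPENAI_API_KEY is set in the environment.
import json
from openai import OpenAI

client = OpenAI()

questions = [
    "Explain the difference between fine-tuning and distillation.",
    "When is a smaller model a good target for distillation?",
]

# 1. Generate the distilled dataset: ask the teacher model each training question
#    and store question/answer pairs in the chat-format JSONL that fine-tuning expects.
with open("distilled.jsonl", "w") as f:
    for q in questions:
        teacher = client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": q}],
        )
        record = {
            "messages": [
                {"role": "user", "content": q},
                {"role": "assistant", "content": teacher.choices[0].message.content},
            ]
        }
        f.write(json.dumps(record) + "\n")

# 2. Upload the dataset and launch a fine-tuning job on the student model.
training_file = client.files.create(file=open("distilled.jsonl", "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",
)
print(job.id)
```

The evaluation steps (items 8 and 11) would then compare the base and fine-tuned student on held-out questions to check whether the distilled behaviour transferred.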