OpenAI Fine-tuning vs Distillation - Techniques and Implementation

Trelis Research via YouTube

Generating and storing a distilled dataset

9 of 14


Classroom Contents

  1. Fine-tuning and Distilling with OpenAI
  2. Video Overview and Colab Notebook
  3. Why bother with distilling or fine-tuning?
  4. How is distillation different to fine-tuning?
  5. Simple Approach to Distillation
  6. Installation, student and teacher model selection
  7. Set up training questions
  8. Setting up simple evaluation
  9. Generating and storing a distilled dataset
  10. Running fine-tuning on the distilled dataset
  11. Evaluating the fine-tuned model
  12. Advanced techniques to generate larger datasets
  13. Results from comprehensive fine-tuning and distillation from gpt-4o to gpt-4o-mini
  14. Notes on OpenAI Evals and Gemini fine-tuning
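
The workflow covered in chapters 7 through 10 can be sketched in a few lines of Python. The snippet below is a minimal illustrative sketch, not the notebook from the video: it assumes the official `openai` Python SDK with an `OPENAI_API_KEY` in the environment, queries the teacher model (gpt-4o) on a small set of training questions, stores the prompt-completion pairs in the chat-format JSONL that OpenAI fine-tuning expects, and launches a fine-tuning job on the student model (gpt-4o-mini). The question list and the file name `distilled_dataset.jsonl` are placeholders.

```python
# Minimal sketch: distill gpt-4o answers into a JSONL dataset and
# fine-tune gpt-4o-mini on it. Assumes the `openai` SDK and an
# OPENAI_API_KEY; the questions below are illustrative placeholders.
import json
from openai import OpenAI

client = OpenAI()

TEACHER = "gpt-4o"        # teacher model being distilled
STUDENT = "gpt-4o-mini"   # student model to fine-tune

questions = [
    "Explain the difference between fine-tuning and distillation.",
    "When is a smaller fine-tuned model preferable to a larger base model?",
]

# 1. Generate the distilled dataset: ask the teacher each training question.
rows = []
for q in questions:
    answer = client.chat.completions.create(
        model=TEACHER,
        messages=[{"role": "user", "content": q}],
    ).choices[0].message.content
    rows.append({
        "messages": [
            {"role": "user", "content": q},
            {"role": "assistant", "content": answer},
        ]
    })

# 2. Store it in the chat-format JSONL expected by OpenAI fine-tuning.
with open("distilled_dataset.jsonl", "w") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")

# 3. Upload the file and start a fine-tuning job on the student model.
training_file = client.files.create(
    file=open("distilled_dataset.jsonl", "rb"),
    purpose="fine-tune",
)
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model=STUDENT,
)
print("Fine-tuning job:", job.id)
```

Evaluating the fine-tuned model (chapter 11) would then compare the student's answers before and after fine-tuning against the teacher's on a held-out set of questions.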
