Fine-tuning LLMs - Every Step Explained for Memorization Tasks

Trelis Research via YouTube

Choosing the best batch size (6 of 14)

Classroom Contents

  1. Fine-tuning on a custom dataset
  2. Video Overview
  3. GPTs as statistical models
  4. What is the reversal curse?
  5. Synthetic dataset generation
  6. Choosing the best batch size
  7. What learning rate to use for fine-tuning?
  8. How many epochs to train for?
  9. Choosing the right base model
  10. Step by step dataset generation
  11. Fine-tuning script, step-by-step
  12. Performance Ablation: Hyperparameters
  13. Performance Ablation: Base Models
  14. Final Recommendations for Fine-tuning for Memorization
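
The chapter list above names the three training hyperparameters the video treats in depth: batch size (chapter 6), learning rate (chapter 7), and number of epochs (chapter 8). For orientation only, the sketch below shows where those settings live in a typical Hugging Face Trainer fine-tuning run; it is a generic, assumed setup rather than the script from the video, and the model name, dataset file, and numeric values are placeholders.

    # Generic sketch: where batch size, learning rate, and epoch count are set in a
    # Hugging Face fine-tuning run. Model name, dataset path, and all numeric values
    # are placeholders, not recommendations from the video.
    from datasets import load_dataset
    from transformers import (
        AutoModelForCausalLM,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    model_name = "meta-llama/Llama-2-7b-hf"        # placeholder base model
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token  # some base models ship without a pad token
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Hypothetical synthetic dataset: one JSON object per line with a "text" field.
    dataset = load_dataset("json", data_files="synthetic_dataset.jsonl")["train"]
    tokenized = dataset.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
        batched=True,
        remove_columns=dataset.column_names,
    )

    args = TrainingArguments(
        output_dir="finetune-out",
        per_device_train_batch_size=4,   # chapter 6: choosing the best batch size
        learning_rate=1e-5,              # chapter 7: what learning rate to use
        num_train_epochs=3,              # chapter 8: how many epochs to train for
        logging_steps=10,
    )

    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=tokenized,
        # Causal-LM collator (mlm=False) copies input_ids into labels so a loss is computed.
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

Note that the effective batch size can also be raised without extra memory by setting gradient_accumulation_steps in TrainingArguments, which is worth keeping in mind when weighing the batch-size choices discussed in chapter 6.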
