

Fine-tuning Optimizations - DoRA, NEFT, LoRA+, and Unsloth

Trelis Research via YouTube

Overview

Explore advanced fine-tuning optimization techniques for large language models in this video tutorial. Delve into LoRA (Low-Rank Adaptation) and the methods that build on it: DoRA (Weight-Decomposed Low-Rank Adaptation), NEFT (Noisy Embedding Fine-Tuning), LoRA+, and Unsloth. Learn how each method works, what advantages it offers, and how to apply it in practice through detailed explanations and notebook walk-throughs. Compare the effectiveness of the techniques and get guidance on choosing the best approach for your fine-tuning needs. Use the provided resources, including GitHub repositories, slides, and research papers, to deepen your understanding and application of these optimization strategies.
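
For orientation, here is a minimal, hypothetical sketch of how LoRA, DoRA, and NEFT-style noisy embeddings are commonly switched on with the Hugging Face Transformers and PEFT libraries; the model name and hyperparameters below are placeholder assumptions, not the settings used in the video's notebooks.

# Hypothetical sketch (not the video's exact notebooks): LoRA with DoRA
# enabled, plus NEFT noise, using Hugging Face Transformers + PEFT.
from transformers import AutoModelForCausalLM, TrainingArguments
from peft import LoraConfig, get_peft_model

# Placeholder base model for illustration.
model = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")

# LoRA adds trainable low-rank matrices A and B to the chosen projection layers;
# use_dora=True switches on DoRA's weight-decomposed (magnitude + direction) variant.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    use_dora=True,
)
model = get_peft_model(model, peft_config)
model.print_trainable_parameters()

# NEFT perturbs the embedding outputs with uniform noise during training;
# the Hugging Face Trainer exposes this as neftune_noise_alpha.
args = TrainingArguments(
    output_dir="finetune-out",
    per_device_train_batch_size=2,
    learning_rate=2e-4,
    neftune_noise_alpha=5.0,
)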

Syllabus

Improving on LoRA
Video Overview
How does LoRA work?
Understanding DoRA
NEFT - Adding Noise to Embeddings
LoRA Plus
Unsloth for fine-tuning speedups
Comparing LoRA+, Unsloth, DoRA, NEFT
Notebook Setup and LoRA
DoRA Notebook Walk-through
NEFT Notebook Example
LoRA Plus
Unsloth
Final Recommendation
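
The LoRA Plus and Unsloth chapters above are notebook walk-throughs. As a rough illustration of the core LoRA+ idea only (training the LoRA B matrices with a larger learning rate than the A matrices), a plain PyTorch sketch using optimizer parameter groups might look like the following; the learning rate and ratio are illustrative assumptions, and `model` is assumed to be a PEFT-wrapped model like the one sketched under the overview.

import torch

base_lr = 2e-4
loraplus_ratio = 16  # LoRA+ trains B with a learning rate well above that of A

group_a, group_b = [], []
for name, param in model.named_parameters():
    if not param.requires_grad:
        continue
    (group_b if "lora_B" in name else group_a).append(param)

optimizer = torch.optim.AdamW([
    {"params": group_a, "lr": base_lr},                   # lora_A and other trainables
    {"params": group_b, "lr": base_lr * loraplus_ratio},  # lora_B
])

Unsloth, by contrast, is a drop-in library that re-implements the model's forward and backward passes with optimized kernels to speed up LoRA fine-tuning, rather than changing the adapter math itself.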

Taught by

Trelis Research

