YouTube

Very Few Parameter Fine-Tuning with ReFT and LoRA

Trelis Research via YouTube

Overview

Explore advanced techniques for fine-tuning large language models with minimal trainable parameters in this 55-minute video from Trelis Research. Delve into ReFT (Representation Fine-Tuning) and LoRA (Low-Rank Adaptation), starting with a review of the transformer architecture. Learn the practical aspects of weight fine-tuning with LoRA, followed by an in-depth look at Representation Fine-Tuning. Compare the two approaches and understand their respective strengths. Get hands-on experience with step-by-step walkthroughs of both LoRA and ReFT fine-tuning, including GPU setup considerations. Discover techniques for combining ReFT fine-tunes and explore the role of orthogonality in fine-tuning. Examine the limitations of LoReFT and LoRA fine-tuning, and conclude with practical tips to sharpen your fine-tuning skills. Access additional resources, including complete scripts, one-click fine-tuning templates, and community support.
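As background for the weight fine-tuning portion of the video, here is a minimal sketch of the LoRA idea: a frozen pretrained weight matrix is augmented with a trainable low-rank update scaled by alpha/r. The names and values below (`d`, `r`, `alpha`) are illustrative choices, not taken from the video.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 8, 2, 4  # hidden size, LoRA rank, scaling factor (illustrative)

W = rng.normal(size=(d, d))          # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                 # trainable up-projection, zero-initialized

def lora_forward(x):
    # y = x W^T + (alpha / r) * x A^T B^T
    # Only A and B (2 * r * d parameters) would be updated during training,
    # instead of all d * d parameters of W.
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

x = rng.normal(size=(1, d))
# Because B is zero-initialized, the adapted layer reproduces the base
# layer exactly at the start of training.
assert np.allclose(lora_forward(x), x @ W.T)
```

With these shapes, `A` and `B` hold 2·r·d = 32 parameters versus d·d = 64 for `W`; at realistic model sizes (d in the thousands, r in the single or low double digits) the ratio is far smaller, which is what makes LoRA a "few parameter" method.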

Syllabus

ReFT and LoRA Fine-tuning with few parameters
Video Overview
Transformer Architecture Review
Weight fine-tuning with LoRA
Representation Fine-tuning ReFT
Comparing LoRA with ReFT
Fine-tuning GPU setup
LoRA Fine-tuning walk-through
ReFT fine-tuning walk-through
Combining ReFT fine-tunes
Orthogonality and combining fine-tunes
Limitations of LoReFT and LoRA fine-tuning
Fine-tuning tips
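The Representation Fine-Tuning items above can be grounded with a sketch of the LoReFT intervention from the ReFT paper: instead of updating weights, a hidden state h is edited only within a low-rank subspace whose orthonormal basis rows form R, via phi(h) = h + R^T(Wh + b - Rh). The shapes and initializations below are illustrative assumptions, not the video's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # hidden size, intervention rank (illustrative)

# R has orthonormal rows spanning the learned r-dimensional edit subspace.
Q, _ = np.linalg.qr(rng.normal(size=(d, r)))
R = Q.T                              # shape (r, d); R @ R.T == identity
W = rng.normal(size=(r, d)) * 0.1    # trainable linear map
b = np.zeros(r)                      # trainable bias

def loreft(h):
    # phi(h) = h + R^T (W h + b - R h)
    # The hidden state is changed only inside the subspace spanned by R's rows.
    return h + R.T @ (W @ h + b - R @ h)

h = rng.normal(size=d)
out = loreft(h)
# Components of h orthogonal to R's rows pass through unchanged.
assert np.allclose(out - R.T @ (R @ out), h - R.T @ (R @ h))
```

That last property is one intuition behind the "orthogonality and combining fine-tunes" item: interventions trained on (near-)orthogonal subspaces edit different directions of the representation, so they can potentially be stacked without interfering.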

Taught by

Trelis Research
