
LoRA - Low Rank Adaptation for Parameter Efficient Fine-Tuning

CodeEmporium via YouTube

Overview

Learn about LoRA (Low-Rank Adaptation), a parameter-efficient fine-tuning technique for neural networks, in this 29-minute educational video. Explore how low-rank matrices and adapters work together to reduce storage requirements and introduce no additional inference latency while maintaining model performance. Master the technical concepts through three passes: starting with low-rank matrices, moving to adapters, and culminating in low-rank adapters. Interactive quizzes after each section reinforce understanding of these machine learning concepts. Drawing on multiple research papers and practical implementations, discover how this approach reduces the number of trainable parameters per task while leaving the pretrained network's architecture unchanged.
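The core idea covered in the video can be sketched in a few lines: instead of updating a full weight matrix W during fine-tuning, LoRA freezes W and trains two small factors B and A whose product forms a low-rank update, W' = W + (alpha / r) * B @ A. Below is a minimal PyTorch sketch of such a layer; the class name LoRALinear and the hyperparameters r and alpha are illustrative choices, not taken from the video.

```python
# Minimal LoRA-style linear layer (sketch, assuming the standard
# formulation W' = W + (alpha / r) * B @ A from the LoRA paper).
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features: int, out_features: int,
                 r: int = 8, alpha: float = 16.0):
        super().__init__()
        # Frozen pretrained weight: not updated during fine-tuning.
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features), requires_grad=False)
        # Trainable low-rank factors: A (r x in) and B (out x r).
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        # B starts at zero, so the adapter changes nothing at step 0.
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        base = x @ self.weight.T                       # frozen path
        update = (x @ self.lora_A.T) @ self.lora_B.T   # low-rank path
        return base + self.scaling * update

layer = LoRALinear(1024, 1024, r=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
frozen = sum(p.numel() for p in layer.parameters() if not p.requires_grad)
print(f"trainable: {trainable:,} vs frozen: {frozen:,}")
# trainable: 16,384 vs frozen: 1,048,576
```

Because the update is a plain matrix addition, the factors can be merged into the frozen weight before deployment (W + (alpha / r) * B @ A), which is why the technique adds no inference latency.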

Syllabus

Introduction
Pass 1: Low Rank Matrices
Quiz 1
Pass 2: Adapters
Quiz 2
Pass 3: Low Rank Adapters
Quiz 3
Summary

Taught by

CodeEmporium
