
RAG vs PEFT - Comparing LLM Domain Adaptation Methods

SK AI SUMMIT 2024 via YouTube

Overview

Explore a technical conference presentation from SK TECH SUMMIT 2023 that delves into comparing two key methods for adapting Large Language Models (LLMs) to specific domains: Retrieval-Augmented Generation (RAG) and Parameter-Efficient Fine-Tuning (PEFT). Learn from SK Broadband's practical experience implementing these approaches using internal data sources like customer service manuals and legal documentation. Gain valuable insights into building AWS environments for LLM/sLLM model deployment and developing domain-specific LLM models. The speaker, Hyunseok Kim from SK Broadband, brings expertise from autonomous vehicle research and semiconductor industry experience, currently working as a data scientist focusing on NLP applications for content metadata discovery and viewing prediction tasks.
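To make the contrast concrete, here is a minimal toy sketch of the two strategies the talk compares. All data, names, and numbers are illustrative assumptions, not SK Broadband's actual pipeline: RAG leaves the model untouched and augments the prompt with a retrieved document, while PEFT freezes the base weights and trains only a small added delta (as in LoRA-style low-rank updates).

```python
# Toy contrast of RAG vs PEFT. Everything here is an illustrative
# assumption for exposition, not the speaker's implementation.

def retrieve(query, docs):
    """RAG step: pick the document sharing the most words with the query."""
    q = set(query.lower().split())
    return max(docs, key=lambda d: len(q & set(d.lower().split())))

def rag_prompt(query, docs):
    """RAG keeps the model unchanged and augments the prompt instead."""
    context = retrieve(query, docs)
    return f"Context: {context}\nQuestion: {query}"

def peft_forward(x, w_base, w_delta):
    """PEFT keeps the base weight frozen and trains only a small delta;
    here both are scalars standing in for weight matrices."""
    return x * (w_base + w_delta)

docs = [
    "Customer service manual: reset the set-top box by holding power.",
    "Legal documentation: contract termination requires 30 days notice.",
]
print(rag_prompt("How do I reset my set-top box?", docs))
print(peft_forward(2.0, w_base=1.0, w_delta=0.5))  # base frozen, delta learned
```

The trade-off the talk examines follows directly from this shape: RAG needs a retrieval index over internal documents but no training, while PEFT needs a (lightweight) training run but no retrieval step at inference time.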

Syllabus

[SK TECH SUMMIT 2023] RAG vs PEFT as LLM Adaptation Methods: Which Wins for Domain Application?

Taught by

SK AI SUMMIT 2024

