YouTube

Boosting Large Language Models with Retrieval Augmented Generation

AICamp via YouTube

Overview

Learn how to enhance Large Language Models (LLMs) through Retrieval Augmented Generation (RAG) in this technical talk presented by Mary Grygleski from DataStax. Explore the limitations of pre-trained language foundation models like ChatGPT in accessing and manipulating up-to-date knowledge, and discover how RAG techniques can overcome these constraints by retrieving external data to augment prompts. Understand the cost-effectiveness and efficiency of RAG compared to pre-training or fine-tuning foundation models, and its role in reducing LLM hallucinations. Dive into practical implementation using an event-driven streaming approach with the open source LangStream library, learning how to integrate existing data streams into generative AI applications through prompt engineering and RAG patterns.
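The RAG pattern the talk covers, retrieving external data and using it to augment the prompt, can be sketched in a few lines. Everything below is illustrative: the toy corpus, the word-overlap scoring (a stand-in for embedding similarity against a vector database such as the ones DataStax provides), and the prompt template are assumptions, not material from the talk.

```python
# Minimal sketch of Retrieval Augmented Generation (RAG).
# A real pipeline would use embedding similarity against a vector store
# and send the built prompt to an LLM; both are stubbed out here.

def retrieve(query, corpus, k=2):
    """Rank documents by naive word overlap with the query
    (a stand-in for vector similarity search)."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, corpus):
    """Augment the user's question with retrieved context before it
    is sent to the LLM -- the core of the RAG pattern."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "LangStream is an open source event-driven framework for generative AI.",
    "RAG retrieves external data at query time to ground LLM answers.",
    "Fine-tuning updates model weights and is costlier than RAG.",
]

print(build_prompt("What does RAG retrieve?", corpus))
```

Because the retrieved context is fetched at query time, the model can answer from data it was never trained on, which is why RAG reduces hallucinations without the cost of pre-training or fine-tuning.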

Syllabus

Boost LLMs with Retrieval Augmented Generation

Taught by

AICamp
