Overview
Explore the intricacies of Retrieval-Augmented Generation (RAG) and its impact on Large Language Model (LLM) applications in this 28-minute conference talk. Delve into the fusion of retrieval and generation models, understanding how RAG grounds model responses in external knowledge bases to improve comprehension and accuracy. Examine the challenges of evaluating LLM applications, particularly in domain-specific contexts, and discover essential strategies for assessing and optimizing RAG performance. Learn from Atita Arora, a Solution Architect at Qdrant and respected expert in information retrieval systems, as she shares insights on navigating challenges and improving LLM-based applications. Gain valuable knowledge on calibrating information retrieval systems, leveraging vectors in e-commerce search, and fostering diversity and inclusion in the tech industry.
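The talk itself is not a coding session, but the retrieve-then-generate loop it describes can be illustrated with a minimal, self-contained sketch. The embed() and generate() functions below are hypothetical stand-ins (a toy character-count embedding and a prompt builder) for a real embedding model and LLM; the example only shows how retrieved context is folded into the prompt that reaches the generator, not the method presented in the talk.

```python
# Minimal RAG sketch: retrieve the most relevant documents for a query,
# then hand them to a generator as context. embed() and generate() are
# hypothetical placeholders for a real embedding model and LLM call.
import math

def embed(text: str, dim: int = 64) -> list[float]:
    # Toy bag-of-characters embedding; a real system would use a trained model.
    vec = [0.0] * dim
    for ch in text.lower():
        vec[ord(ch) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already normalized, so the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def generate(query: str, context: list[str]) -> str:
    # Placeholder for an LLM call: the prompt combines retrieved context and the query.
    return "Context:\n" + "\n".join(context) + f"\n\nQuestion: {query}"

corpus = [
    "Qdrant is a vector database used for similarity search.",
    "RAG combines retrieval with generation to ground LLM answers.",
    "E-commerce search can rank products with vector similarity.",
]
question = "How does RAG improve LLM answers?"
print(generate(question, retrieve(question, corpus)))
```

In a production setting the in-memory corpus and toy embedding would be replaced by a vector database (such as Qdrant) and a learned embedding model, and evaluating the quality of the retrieved context is exactly the kind of challenge the talk addresses.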
Syllabus
Navigating Challenges and Enhancing Performance of LLM-based Applications
Taught by
The ASF