

LLMs: Consider Hallucinatory Unless Proven Otherwise

Pinecone via YouTube

Overview

Explore the path toward a hallucination-free future in this 22-minute conference talk by Shayak Sen, CTO of TruEra. Delve into how Large Language Models (LLMs) generalize and why their unverified outputs should be treated as hallucinatory. Discover the RAG Triad (context relevance, groundedness, and answer relevance) and its role in evaluating retrieval-augmented generation. Learn about TruLens and how feedback functions are used to measure and improve response quality. Gain insights into query planning strategies and best practices for applying these techniques. The talk concludes with a Q&A session that provides additional detail. This presentation offers valuable knowledge for anyone interested in advancing LLM technology and reducing hallucinations in AI-generated content.
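
The RAG Triad described in the talk scores each retrieval-augmented response along three axes: context relevance, groundedness, and answer relevance. The sketch below is a minimal, library-agnostic illustration of that idea, not the TruLens API; the RagRecord and Judge types, the rag_triad helper, and the placeholder judge are assumptions made for this example.

```python
from dataclasses import dataclass
from typing import Callable

# A feedback function maps a grading prompt to a score in [0, 1].
Judge = Callable[[str], float]

@dataclass
class RagRecord:
    """One question/answer trace from a RAG pipeline."""
    query: str           # the user question
    contexts: list[str]  # passages returned by the retriever
    answer: str          # the model's final response

def context_relevance(record: RagRecord, judge: Judge) -> float:
    # Are the retrieved passages actually about the question?
    scores = [
        judge(f"Rate 0-1: how relevant is this passage to the question?\n"
              f"Question: {record.query}\nPassage: {ctx}")
        for ctx in record.contexts
    ]
    return sum(scores) / len(scores)

def groundedness(record: RagRecord, judge: Judge) -> float:
    # Is every claim in the answer supported by the retrieved evidence?
    evidence = "\n".join(record.contexts)
    return judge(f"Rate 0-1: how well is the answer supported by the evidence?\n"
                 f"Evidence: {evidence}\nAnswer: {record.answer}")

def answer_relevance(record: RagRecord, judge: Judge) -> float:
    # Does the answer actually address the original question?
    return judge(f"Rate 0-1: how well does the answer address the question?\n"
                 f"Question: {record.query}\nAnswer: {record.answer}")

def rag_triad(record: RagRecord, judge: Judge) -> dict[str, float]:
    # A low score on any leg of the triad flags a likely hallucination:
    # the answer stays "hallucinatory unless proven otherwise".
    return {
        "context_relevance": context_relevance(record, judge),
        "groundedness": groundedness(record, judge),
        "answer_relevance": answer_relevance(record, judge),
    }

if __name__ == "__main__":
    record = RagRecord(
        query="What kind of search does a vector database provide?",
        contexts=["Vector databases support approximate nearest-neighbor search ..."],
        answer="They let you retrieve items by semantic similarity ...",
    )
    # Placeholder judge for the sketch; in practice this would call an
    # LLM-based feedback provider rather than return a constant.
    print(rag_triad(record, judge=lambda prompt: 0.5))
```

In a framework such as TruLens, checks of this kind are expressed as feedback functions attached to an instrumented application; the point of the triad is that a response is only trusted when all three checks score well.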

Syllabus

Introduction
LLMs and Generalization
The RAG Triad
TruLens
Feedback Functions
Getting This Right
Query Planning
Conclusion
Q&A

Taught by

Pinecone

