Embeddings for RAG - Understanding BERT and Sentence Transformers
Overview
Learn about embedding models for Retrieval Augmented Generation (RAG) in this 15-minute educational video, which builds on earlier entries in the RAG series covering data ingestion and PDF parsing. Explore the fundamentals of embeddings, starting with a clear explanation of what they are and the role they play in RAG systems. Progress through a structured hierarchy of embedding models, diving into the Transformer architecture and its evolution. Master the concepts behind BERT (Bidirectional Encoder Representations from Transformers) and SBERT (Sentence-BERT), understanding their applications and advantages. Conclude with a practical hands-on demonstration using the Sentence Transformers library, gaining real-world implementation experience. Follow along with clearly marked timestamps for easy navigation through the topics, from basic concepts to advanced applications in natural language processing and machine learning.
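To make the BERT-to-SBERT step concrete: vanilla BERT outputs one vector per token, so a sentence embedding has to be pooled from those token vectors, and mean pooling is the strategy Sentence-BERT popularised. The sketch below shows this with Hugging Face transformers; the model name, example sentence, and code are illustrative assumptions, not taken from the video.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative checkpoint; the video may use a different one.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "Embeddings map text to dense vectors."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    # One 768-dimensional vector per token: shape (1, seq_len, 768).
    token_embeddings = model(**inputs).last_hidden_state

# Mean-pool over the token dimension, ignoring padding positions,
# to obtain a single 768-dimensional sentence vector (SBERT-style).
mask = inputs["attention_mask"].unsqueeze(-1)  # (1, seq_len, 1)
sentence_embedding = (token_embeddings * mask).sum(1) / mask.sum(1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```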
Syllabus
- Intro
- What is an embedding?
- Hierarchy of embedding models
- Transformers
- BERT
- SBERT
- Hands-on Sentence Transformers (see the sketch after this list)
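A minimal sketch of the kind of workflow the hands-on Sentence Transformers segment demonstrates: encode a few sentences into fixed-size dense vectors and compare them with cosine similarity. The model name and example sentences are illustrative choices, not necessarily the ones used in the video.

```python
from sentence_transformers import SentenceTransformer, util

# Load a small pretrained sentence-embedding model (assumed here; the video may use another).
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "How do I parse a PDF for a RAG pipeline?",
    "Extracting text from PDF documents for retrieval.",
    "The weather is sunny today.",
]

# Each sentence becomes a fixed-size dense vector (384 dimensions for this model).
embeddings = model.encode(sentences, convert_to_tensor=True)

# Cosine similarity of the first sentence against the other two:
# the semantically related sentence scores noticeably higher.
scores = util.cos_sim(embeddings[0], embeddings[1:])
print(scores)
```

The same encode-then-compare pattern underlies retrieval in a RAG system: document chunks are embedded once and stored, and at query time the question is embedded and matched against them by similarity.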
Taught by
AI Bites