A Cheap Trick for Semantic Question Answering for GPU-Challenged Systems
OpenSource Connections via YouTube
Overview
Explore a cost-effective approach to semantic question answering in this conference talk from Haystack US 2023. Learn how to use Large Language Models (LLMs) at indexing time to generate questions from passages, and how those generated questions are matched against incoming queries at search time using text-based or vector-based matching. Discover a method that addresses the high infrastructure costs and potential hallucinations associated with LLM-based search pipelines. Gain insights into designing efficient search systems that prioritize speed and affordability while maintaining quality. Presented by Sujit Pal, Technical Research Director at Elsevier Health Markets, this talk draws on his extensive experience in search engine development and in applying Machine Learning techniques to enhance search functionality.
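To make the idea concrete, here is a minimal sketch of the index-time question-generation pattern described above. It is not the speaker's implementation: the `generate_questions` function is a hypothetical stand-in for whatever LLM call produces questions from a passage, and the sentence-transformers model name is an assumed choice used only to illustrate the vector-based matching step.

```python
# Sketch: generate questions per passage at index time, then match queries
# against those questions at search time (no LLM call on the query path).
from sentence_transformers import SentenceTransformer, util

def generate_questions(passage: str) -> list[str]:
    # Hypothetical stand-in for an LLM question-generation call; the talk's
    # actual prompt/model is not specified here. Returns a toy question so
    # the sketch runs end to end.
    return [f"What is this passage about: {passage[:60]}?"]

passages = [
    "Elsevier publishes clinical reference content for health professionals.",
    "Vector search matches queries and documents by embedding similarity.",
]

# Index time: attach generated questions to each passage and embed them once.
model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
index = []
for passage in passages:
    for question in generate_questions(passage):
        index.append({"question": question, "passage": passage})
question_embeddings = model.encode([entry["question"] for entry in index])

# Query time: embed the query, find the closest pre-generated questions,
# and return the passages they were generated from.
def search(query: str, top_k: int = 3):
    query_embedding = model.encode(query)
    hits = util.semantic_search(query_embedding, question_embeddings, top_k=top_k)[0]
    return [(index[hit["corpus_id"]]["passage"], hit["score"]) for hit in hits]

print(search("How does vector search work?"))
```

The point of the pattern is that the expensive LLM work happens offline during indexing; the online query path only needs cheap text or vector matching, which is what keeps it viable on GPU-challenged infrastructure.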
Syllabus
Haystack US 2023 - Sujit Pal: A Cheap Trick for Semantic Question Answering for the GPU challenged
Taught by
OpenSource Connections