
Personalizing Search Using Multimodal Latent Behavioral Embeddings

OpenSource Connections via YouTube

Overview

Explore the cutting-edge approach to personalizing search using multimodal latent behavioral embeddings in this 45-minute conference talk from Haystack US 2024. Delve into the importance of incorporating user context and behavioral signals to optimize search relevance, moving beyond traditional keyword-based and content embedding methods. Learn how to integrate user behavior into modern search retrieval pipelines for RAG and end-user search, combining content, domain, and user understanding for a holistic approach to search relevance. Discover techniques for training embedding models using behavioral signals, implementing personalized search experiences, and applying appropriate contextual guardrails. Gain insights into traditional signals-based models for AI-powered search and their mapping into multimodal embedding approaches. Witness live, open-source code examples demonstrating how modern hybrid search approaches can learn user and group affinities. Benefit from the expertise of Trey Grainger, lead author of "AI-Powered Search" and founder of Searchkernel, as he shares his extensive experience in developing semantic search, personalization, and recommendation systems.
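As a rough illustration of the kind of personalization the talk describes — blending content relevance with a user's latent behavioral embedding in a hybrid scoring step — here is a minimal sketch. All names, weights, and the blending scheme are hypothetical illustrations, not taken from the talk:

```python
import numpy as np

def normalize(v):
    # Unit-normalize so dot products act as cosine similarity
    return v / np.linalg.norm(v)

def personalized_scores(query_emb, user_emb, doc_embs, alpha=0.7):
    """Blend content relevance with user behavioral affinity.

    alpha weights query-document similarity; (1 - alpha) weights
    similarity to the user's latent behavioral embedding.
    (Hypothetical weighting scheme for illustration only.)
    """
    q = normalize(query_emb)
    u = normalize(user_emb)
    docs = np.array([normalize(d) for d in doc_embs])
    content = docs @ q    # how well each document matches the query
    affinity = docs @ u   # how well each document matches the user's behavior
    return alpha * content + (1 - alpha) * affinity

# Toy example: doc 0 is closer to the query, doc 1 is closer
# to the user's behavioral profile.
query = np.array([1.0, 0.0, 0.0])
user = np.array([0.0, 1.0, 0.0])
docs = [np.array([0.9, 0.1, 0.0]), np.array([0.1, 0.9, 0.0])]
scores = personalized_scores(query, user, docs, alpha=0.7)
```

Lowering `alpha` shifts the ranking toward the user's behavioral affinities, which is one way a "contextual guardrail" could cap how strongly personalization overrides content relevance.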

Syllabus

Haystack US 2024 - Trey Grainger: Personalizing search using multimodal latent behavioral embeddings

Taught by

OpenSource Connections
