Overview
Explore fine-tuning large language models (LLMs) for semantic search in this conference talk by Dr. Roman Grebennikov at MLcon Munich 2024. Learn how to customize models for specialized domains such as medicine, law, or hardware using open-source tools like sentence-transformers and nixietune. The talk covers data requirements, training bi-encoders and cross-encoders, and achieving measurable quality improvements with a single GPU, illustrated with real-world examples of enhanced semantic search. The 49-minute session offers practical guidance for anyone looking to move beyond the limitations of off-the-shelf semantic search models in specialized fields.
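As background on the terminology the talk uses: a bi-encoder embeds queries and documents into vectors independently and ranks documents by a similarity measure such as cosine similarity, while a cross-encoder scores each query-document pair jointly (slower, but typically more accurate). The following is a minimal illustrative sketch of bi-encoder-style ranking; the hand-written toy vectors are placeholders standing in for embeddings that a real model (e.g. one trained with sentence-transformers) would produce.

```python
import math

# Toy "embeddings" (assumed values for illustration only); in practice
# these would come from a fine-tuned bi-encoder model.
EMBEDDINGS = {
    "query: treat hypertension": [0.9, 0.1, 0.2],
    "doc: blood pressure medication guide": [0.8, 0.2, 0.3],
    "doc: laptop hardware troubleshooting": [0.1, 0.9, 0.1],
}

def cosine(a, b):
    """Cosine similarity: how a bi-encoder compares pre-computed vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank(query, docs):
    """Bi-encoder retrieval: look up each vector, score docs independently."""
    q = EMBEDDINGS[query]
    return sorted(docs, key=lambda d: cosine(q, EMBEDDINGS[d]), reverse=True)

docs = [
    "doc: laptop hardware troubleshooting",
    "doc: blood pressure medication guide",
]
print(rank("query: treat hypertension", docs)[0])
# → doc: blood pressure medication guide
```

Because document vectors can be computed once and indexed ahead of time, bi-encoders scale to large corpora; a cross-encoder is often applied afterwards to re-rank only the top candidates.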
Syllabus
Practical LLM Fine Tuning For Semantic Search | Dr. Roman Grebennikov
Taught by
MLCon | Machine Learning Conference