Explore a conference talk from Haystack US 2024 in which Doug Rosenoff, Director of Global Search Test Tools at LexisNexis, discusses adapting a Human Relevance Testing Framework to evaluate generative AI search results and summaries. Learn how the modified framework enables rapid, frequent assessment of large, diverse corpora for search and summarization functions. Discover how traditional search testing methods were transformed to accommodate generative AI, including the new metrics developed and the resulting methodology outputs. Gain insights into generative search and summarization techniques, as well as potential testing use cases such as comparison and regression methods. Delve into Rosenoff's extensive experience in electronic publishing and research, including his work on patented search algorithms and automatic linking. This 47-minute presentation, brought to you by OpenSource Connections, offers valuable knowledge for professionals interested in advancing search and AI testing methodologies.