
Dirichlet Energy Minimization Explains In-Context Learning

Discover AI via YouTube

Overview

Explore Harvard University research on In-Context Learning (ICL) and Retrieval-Augmented Generation (RAG) in this 29-minute video presentation. Delve into new insights into the augmentation process and discover when augmented prompts can override an LLM's semantic prior. Learn how phase transitions emerge in LLM learning performance through the lens of Dirichlet energy minimization, and how transformer learning for ICL can be optimized without expensive fine-tuning or pre-training. Based on collaborative research from Harvard's CBS-NTT Program in Physics of Intelligence, the Department of Physics, SEAS, NTT Research Inc., and the University of Michigan, the presentation offers practical approaches to improving both ICL and the augmentation component of RAG systems, along with insight into the mathematical foundations of how large language models adapt to new information supplied in context.
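For readers unfamiliar with the term, a brief illustrative sketch (not taken from the video) of what Dirichlet energy measures may help. On a weighted graph, the Dirichlet energy of a signal f is E(f) = ½ Σᵢⱼ wᵢⱼ (fᵢ − fⱼ)², which equals the quadratic form fᵀLf for the graph Laplacian L; minimizing it favors signals that vary smoothly across connected nodes, the kind of smoothness intuition the talk applies to in-context learning:

```python
import numpy as np

def dirichlet_energy(W, f):
    """Dirichlet energy f^T L f of signal f on a graph with weight matrix W."""
    d = W.sum(axis=1)      # node degrees
    L = np.diag(d) - W     # combinatorial graph Laplacian
    return float(f @ L @ f)

# Toy example: a 3-node path graph 0 -- 1 -- 2 with unit edge weights.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])

smooth = np.array([1.0, 1.0, 1.0])   # constant signal -> zero energy
rough  = np.array([1.0, -1.0, 1.0])  # oscillating signal -> high energy

print(dirichlet_energy(W, smooth))   # 0.0
print(dirichlet_energy(W, rough))    # 8.0
```

Here the constant signal has zero energy while the oscillating one is penalized by every edge it crosses, which is the sense in which energy minimization selects smooth interpolants.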

Syllabus

Dirichlet Energy Minimization Explains In-Context Learning (Harvard)

Taught by

Discover AI
