Explore the fascinating world of in-context learning in large language models through this 90-minute lecture by Tengyu Ma of Stanford University. Delve into the remarkable ability of these models to tackle downstream tasks simply by conditioning on prompts containing input-output examples, without any parameter updates. Examine several research papers that offer theoretical explanations of in-context learning mechanisms using simplified data distributions. Gain valuable insights into this cutting-edge topic, presented as part of the Special Year on Large Language Models and Transformers: Part 1 Boot Camp at the Simons Institute.
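To make the core idea concrete, here is a minimal sketch of what "conditioning on input-output examples" looks like in practice. The toy sentiment task, the example pairs, and the `build_prompt` helper are illustrative assumptions, not material from the lecture; the point is only that the task is specified entirely in the prompt while the model's weights stay fixed.

```python
def build_prompt(examples, query):
    """Format input-output demonstrations plus a new query as one prompt string."""
    lines = [f"Input: {x}\nOutput: {y}" for x, y in examples]
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

# Few-shot demonstrations for a toy sentiment task (hypothetical examples).
demos = [
    ("The movie was fantastic.", "positive"),
    ("I'd never watch this again.", "negative"),
]

prompt = build_prompt(demos, "An absolute delight from start to finish.")
print(prompt)
# A pretrained model completing this prompt would be expected to emit
# "positive" purely from conditioning on the demonstrations above --
# no gradient updates to its parameters occur at any point.
```

The lecture's theoretical question is why this works at all: how a model trained only on next-token prediction can, at inference time, infer and apply the mapping implied by such demonstrations.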