
Long-context Attention in Near-Linear Time

Simons Institute via YouTube

Overview

Explore a lecture on "Long-context Attention in Near-Linear Time," presented by David Woodruff of Carnegie Mellon University at the Simons Institute. Delve into HyperAttention, an approximate attention mechanism designed to tackle the computational cost of Large Language Models (LLMs) with long contexts. Discover how the approach introduces two fine-grained parameters that capture the hardness of the problem, yielding a linear-time sampling algorithm even when the attention matrix has unbounded entries or large stable rank. Learn about HyperAttention's modular design, which makes it compatible with other fast implementations such as FlashAttention. Examine its empirical performance across long-context datasets, including a roughly 50% reduction in ChatGLM2's inference time at 32k context length and several-fold speedups at longer contexts. Gain insight into the collaborative research behind this advance in sublinear algorithms.
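
To make the mechanism concrete, here is a minimal NumPy sketch of the core idea described above: hash similar queries and keys into shared buckets (the paper uses a sorting-based LSH; this sketch simplifies to exact bucket matching), attend exactly within a query's bucket, where the large attention entries concentrate, and estimate the remaining softmax mass by uniform sampling. All function and parameter names below are illustrative assumptions, not the authors' implementation.

import numpy as np

def simhash_buckets(x, planes):
    # Angular LSH: the sign pattern of random projections gives a bucket id,
    # so nearby vectors tend to land in the same bucket.
    bits = (x @ planes > 0).astype(np.int64)
    return bits @ (1 << np.arange(planes.shape[1]))

def hyperattention_sketch(Q, K, V, n_bits=4, n_samples=64, seed=0):
    # Approximate softmax attention. Exact attention costs O(n^2 d); here each
    # query attends exactly only to keys in its LSH bucket and estimates the
    # rest of the normalization and value sum from a uniform key sample.
    rng = np.random.default_rng(seed)
    n, d = Q.shape
    planes = rng.standard_normal((d, n_bits))   # shared by queries and keys
    qb = simhash_buckets(Q, planes)
    kb = simhash_buckets(K, planes)
    out = np.zeros_like(V)
    for i in range(n):
        # For clarity this scans all keys per query; a real implementation
        # sorts keys by bucket id so each query touches only its own block.
        heavy = np.flatnonzero(kb == qb[i])
        light = np.flatnonzero(kb != qb[i])
        w_heavy = np.exp(Q[i] @ K[heavy].T / np.sqrt(d))
        num = w_heavy @ V[heavy]
        den = w_heavy.sum()
        if light.size:
            m = min(n_samples, light.size)
            idx = rng.choice(light, size=m, replace=False)
            # Unbiased estimate of the light entries' contribution.
            w = np.exp(Q[i] @ K[idx].T / np.sqrt(d)) * (light.size / m)
            num = num + w @ V[idx]
            den = den + w.sum()
        out[i] = num / den
    return out

# Example: 1,024 tokens with 32-dimensional heads.
rng = np.random.default_rng(1)
Q, K, V = (rng.standard_normal((1024, 32)) for _ in range(3))
print(hyperattention_sketch(Q, K, V).shape)  # (1024, 32)

With sorted buckets, per-query work drops from O(n) to O(bucket size + n_samples); the lecture's two parameters characterize when the heavy buckets capture most of the softmax mass, so the sampled residual stays accurate.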

Syllabus

Long-context Attention in Near-Linear Time

Taught by

Simons Institute
