
Linear Structure of High-Level Concepts in Text-Controlled Generative Models

Valence Labs via YouTube

Overview

Explore the linear structure of high-level concepts in text-controlled generative models in this talk by Victor Veitch, hosted by Valence Labs. Delve into the algebraic structure of vector representations in large language models and text-to-image diffusion models. Discover how natural language is embedded into vector representations and used to sample from the model's output space. Examine what it means for representations to be "linear," why such structure emerges, and how it can be used to understand and precisely control generative models. Follow along as the speaker covers the Linear Representation Hypothesis, language models, subspace notions, the causal inner product, and supporting experiments. Gain insights from the conclusions and join the discussion to deepen your understanding of this topic in artificial intelligence and machine learning.
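
The Linear Representation Hypothesis discussed in the talk posits that a high-level concept corresponds to a direction in the model's representation space, so the concept can be measured with a projection and manipulated by shifting along that direction. The snippet below is a minimal illustrative sketch of that idea only, not code from the talk; the embedding dimension, the toy embeddings, and the concept direction are placeholder assumptions.

```python
# Illustrative sketch of the "concept as a direction" idea.
# All vectors here are toy placeholders, not values from the talk.
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical concept direction, e.g. one could estimate it from
# contrastive pairs of embeddings and average their differences.
concept_direction = rng.normal(size=dim)
concept_direction /= np.linalg.norm(concept_direction)

def concept_score(embedding: np.ndarray) -> float:
    """Project an embedding onto the concept direction (a linear probe)."""
    return float(embedding @ concept_direction)

def steer(embedding: np.ndarray, alpha: float) -> np.ndarray:
    """Shift an embedding along the concept direction to alter the concept."""
    return embedding + alpha * concept_direction

x = rng.normal(size=dim)           # stand-in for a token or prompt embedding
print(concept_score(x))            # how strongly the concept is expressed
print(concept_score(steer(x, 2)))  # score increases by 2 under this toy model
```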

Syllabus

- Discussant Slide + Introduction
- Linear Representation Hypothesis
- Language Models
- Subspace Notions
- Causal Inner Product
- Experiments
- Conclusions
- Discussion

Taught by

Valence Labs
