Can Transformers Reason Logically? A Study in SAT-Solving

Harvard CMSA via YouTube

Overview

Explore a mathematics seminar presentation by Georgia Tech's Leyan Pan examining the logical reasoning capabilities of Transformer-based Large Language Models (LLMs) through the lens of SAT-solving. Delve into an investigation of Boolean reasoning in decoder-only Transformers using the Chain-of-Thought methodology, focusing on their capacity to decide 3-SAT instances of bounded size. Learn about the formal expressiveness of Transformer models, understand the correspondence between Chain-of-Thought reasoning and the DPLL SAT-solving algorithm, and discover how 3-SAT formulas and partial assignments can be encoded as vectors and manipulated through attention mechanisms. Examine experimental results supporting the theoretical predictions, along with the limitations of standard Transformers on unbounded-length reasoning problems and potential ways to overcome these constraints.
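For readers unfamiliar with the DPLL procedure the talk relates to Chain-of-Thought reasoning, here is a minimal, self-contained sketch (not the speaker's construction; the function names and the DIMACS-style clause encoding are illustrative choices):

```python
def dpll(clauses, assignment=None):
    """Decide satisfiability of a CNF formula.

    `clauses` is a list of clauses; each clause is a list of nonzero ints
    (DIMACS-style: positive = variable, negative = its negation).
    Returns a satisfying assignment {var: bool} or None if unsatisfiable.
    """
    if assignment is None:
        assignment = {}
    # Simplify every clause under the current partial assignment.
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue  # clause already satisfied
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None  # conflict: clause falsified
        simplified.append(rest)
    if not simplified:
        return assignment  # every clause satisfied
    # Unit propagation: a one-literal clause forces its assignment.
    for clause in simplified:
        if len(clause) == 1:
            lit = clause[0]
            return dpll(clauses, {**assignment, abs(lit): lit > 0})
    # Branch on the first unassigned variable, trying True then False.
    var = abs(simplified[0][0])
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None  # both branches failed: backtrack
```

The talk's equivalence result concerns simulating exactly this kind of propagate-decide-backtrack trace as a chain of generated tokens; the sketch above only fixes the ideas of unit propagation, branching, and backtracking that such a trace would encode.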

Syllabus

Leyan Pan | Can Transformers Reason Logically? A Study in SAT-Solving

Taught by

Harvard CMSA

