Overview
Explore the integration of constraints into deep learning architectures through structured layers in this 50-minute lecture by Zico Kolter of Carnegie Mellon University and the Bosch Center for AI. Delve into topics such as deep equilibrium models, explicit and implicit layers, SAT optimization, and the implicit function theorem. Examine weight-tied, input-injected networks, the expansion of depth to capture different layer types, and the concept of equilibrium points in deep networks. Gain insights into sequence modeling and small-scale benchmarks as part of the "Emerging Challenges in Deep Learning" series presented at the Simons Institute.
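To make the core idea concrete, the sketch below (in PyTorch, with illustrative names; not the speaker's implementation) shows a weight-tied, input-injected cell whose output is an approximate equilibrium point z* = f(z*, x), found by naive forward iteration. The implicit-function-theorem backward pass discussed in the lecture is omitted here; gradients would simply flow through the unrolled iterations.

```python
import torch
import torch.nn as nn

class DEQFixedPoint(nn.Module):
    """Minimal fixed-point layer sketch (assumed names, not an official API)."""

    def __init__(self, hidden_dim, tol=1e-4, max_iter=50):
        super().__init__()
        # One set of weights reused at every "depth" (weight tying),
        # with the input x re-injected at each iteration.
        self.linear_z = nn.Linear(hidden_dim, hidden_dim)
        self.linear_x = nn.Linear(hidden_dim, hidden_dim)
        self.tol, self.max_iter = tol, max_iter

    def cell(self, z, x):
        # f(z, x): a single weight-tied, input-injected layer.
        return torch.tanh(self.linear_z(z) + self.linear_x(x))

    def forward(self, x):
        # Iterate z <- f(z, x) until the update is small, i.e. an
        # approximate equilibrium point z* = f(z*, x).
        z = torch.zeros_like(x)
        for _ in range(self.max_iter):
            z_next = self.cell(z, x)
            if (z_next - z).norm() < self.tol:
                return z_next
            z = z_next
        return z

# Example usage: the layer behaves like an "infinitely deep" weight-tied network.
layer = DEQFixedPoint(hidden_dim=64)
out = layer(torch.randn(8, 64))
```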
Syllabus
Introduction
Deep Equilibrium Models
Depth
Agenda
Explicit Layers
Implicit Layers
SAT Optimization
Implicit Function Theorem
Takeaway
Proposed Class of Structured Layers
Weight-Tied, Input-Injected Networks
Expanding Depth to Capture Both Layer Types
Deep Networks
Summary
Stacking Layers
Do They Exist?
Equilibrium Point
Residual Point
Sequence Modeling
Small-Scale Benchmarks
Taught by
Simons Institute