Overview
Explore the innovative world of Liquid Time-Constant (LTC) Networks in this 50-minute talk by MIT researchers Ramin Hasani and Daniela Rus. Delve into the mechanics of these continuous-time neural network models, which construct networks of linear first-order dynamical systems modulated by nonlinear, interlinked gates. Learn how LTCs represent dynamical systems with input-dependent, varying time constants, whose states are computed with numerical differential-equation solvers. Discover the advantages of LTCs, including their stable and bounded behavior, superior expressivity among neural ordinary differential equations, and improved performance on time-series prediction tasks compared to advanced recurrent network models.

Follow the presentation through topics such as neural dynamics, continuous-time networks, implementation, dynamic causal models, behavioral cloning, and limitations. Gain insights from Dr. Daniela Rus, a distinguished professor and director of MIT CSAIL, and Dr. Ramin Hasani, a postdoctoral associate and machine learning scientist, as they share their expertise in robotics, mobile computing, and interpretable deep learning algorithms.
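To make the core idea concrete, here is a minimal NumPy sketch of a single LTC update step. It assumes the fused semi-implicit Euler discretization described in Hasani et al.'s LTC work; the gate parameterization (a sigmoid over the concatenated state and input) and all dimensions, weights, and names (`ltc_fused_step`, `W`, `b`, `A`, `tau`) are illustrative assumptions, not the speakers' exact implementation.

```python
import numpy as np

def ltc_fused_step(x, I, dt, tau, A, W, b):
    """One fused semi-implicit Euler step of a Liquid Time-Constant cell.

    Continuous dynamics (per the LTC formulation):
        dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
    where f is a nonlinear gate, so the effective time constant
    tau_eff = tau / (1 + tau * f(x, I)) varies with state and input.
    """
    # Nonlinear gate over concatenated state and input, bounded in (0, 1).
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]) + b)))
    # Fused update: treating the decay term implicitly keeps the step
    # stable and the state bounded for any dt > 0.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Hypothetical dimensions: 4 hidden units, 2 inputs.
rng = np.random.default_rng(0)
n, m = 4, 2
x = np.zeros(n)
tau = np.ones(n)
A = rng.standard_normal(n)
W = rng.standard_normal((n, n + m))
b = np.zeros(n)
for _ in range(100):
    x = ltc_fused_step(x, rng.standard_normal(m), dt=0.1,
                       tau=tau, A=A, W=W, b=b)
```

After any number of steps the state stays finite and bounded by the magnitude of the bias vector `A`, illustrating the stability property highlighted in the talk.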
Syllabus
Introduction
Presentation
Liquid Neural Networks
Neural Dynamics
Continuous Time Networks
Implementation
Dynamic Causal Model
Liquid Neural Network
Behavioral Cloning
Limitations
Summary
Taught by
MIT CBMM