
Liquid Neural Networks

MITCBMM via YouTube

Overview

Explore Liquid Time-Constant (LTC) networks in this 50-minute talk by MIT researchers Ramin Hasani and Daniela Rus. Delve into the mechanics of these continuous-time neural network models, which construct networks of linear first-order dynamical systems modulated by nonlinear interlinked gates. Learn how LTCs represent dynamical systems with varying time constants, with outputs computed by numerical differential equation solvers. Discover the advantages of LTCs, including stable and bounded behavior, superior expressivity among neural ordinary differential equations, and improved performance on time-series prediction tasks compared to advanced recurrent network models. Follow the presentation through topics such as neural dynamics, continuous-time networks, implementation, dynamic causal models, behavioral cloning, and limitations. Gain insights from Dr. Daniela Rus, a distinguished professor and director of MIT CSAIL, and Dr. Ramin Hasani, a postdoctoral associate and machine learning scientist, as they share their expertise in robotics, mobile computing, and interpretable deep learning algorithms.
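The "varying time-constant" idea from the overview can be made concrete with a short sketch. The following NumPy snippet, a minimal illustration rather than the talk's implementation (the gate parameterization and all names such as `ltc_step`, `W`, `W_in`, and `A` are hypothetical), shows a fused semi-implicit Euler step for an LTC-style cell, where a nonlinear gate f both drives the state and shortens its effective time constant:

```python
import numpy as np

def ltc_step(x, I, dt, tau, W, W_in, b, A):
    """One fused ODE-solver step of a hypothetical LTC-style cell.

    x   : hidden state, shape (n,)
    I   : input at this step, shape (m,)
    tau : base time constant; the gate f effectively shrinks it
    A   : bias vector toward which the gated dynamics pull the state
    """
    # Nonlinear interlinked gate f(x, I): sigmoid of a linear map (illustrative choice)
    f = 1.0 / (1.0 + np.exp(-(W @ x + W_in @ I + b)))
    # Fused semi-implicit Euler update:
    #   x' = (x + dt * f * A) / (1 + dt * (1/tau + f))
    # The denominator shows the state-dependent ("liquid") time constant:
    # the larger the gate activation f, the faster the state relaxes.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Toy rollout on random inputs
rng = np.random.default_rng(0)
n, m = 4, 2
x = np.zeros(n)
W = 0.1 * rng.normal(size=(n, n))
W_in = 0.1 * rng.normal(size=(n, m))
b = np.zeros(n)
A = np.ones(n)

for _ in range(10):
    x = ltc_step(x, rng.normal(size=m), dt=0.1, tau=1.0, W=W, W_in=W_in, b=b, A=A)

print(x.shape)  # → (4,)
```

Note how the update also illustrates the "stable and bounded" claim: starting from zero with A = 1, the denominator always exceeds the numerator's growth, so each state component stays in [0, 1] regardless of the input sequence.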

Syllabus

Introduction
Presentation
Liquid Neural Networks
Neural Dynamics
Continuous Time Networks
Implementation
Dynamic Causal Model
Liquid Neural Network
Behavioral Cloning
Limitations
Summary

Taught by

MITCBMM
