Explore the groundbreaking concept of Neural Ordinary Differential Equations in this informative video. Delve into a new family of deep neural network models that parameterize the derivative of the hidden state with a neural network, computing the output using a black-box differential equation solver. Discover the advantages of these continuous-depth models, including constant memory cost, adaptive evaluation strategies, and the ability to trade numerical precision for speed. Learn about their applications in continuous-depth residual networks and continuous-time latent variable models. Examine the innovative continuous normalizing flows, a generative model that can be trained by maximum likelihood without partitioning or ordering the data dimensions. Understand the scalable method for backpropagating through ODE solvers, which enables end-to-end training within larger models. Follow along as the video covers the introduction, residual networks, advantages, evaluation, sequential data, experiments, and conclusion, providing a comprehensive overview of this cutting-edge machine learning technique.
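The core idea from the paper the video discusses — that a residual network's update h_{t+1} = h_t + f(h_t) is one Euler step of the ODE dh/dt = f(h, t) — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: `neural_ode_forward` and the one-layer tanh dynamics network are hypothetical stand-ins, and fixed-step Euler replaces the adaptive black-box solver the paper actually uses.

```python
import numpy as np

def neural_ode_forward(h0, theta, t0=0.0, t1=1.0, steps=100):
    """Integrate dh/dt = f(h, t; theta) from t0 to t1 with Euler steps.

    With steps=1 this reduces to a single residual-block update
    h + f(h); increasing `steps` approaches the continuous-depth
    limit described in the video.
    """
    W, b = theta  # parameters of a hypothetical one-layer dynamics net

    def f(h, t):
        # Toy dynamics network standing in for the learned derivative.
        return np.tanh(W @ h + b)

    h = h0.astype(float)
    dt = (t1 - t0) / steps
    for i in range(steps):
        t = t0 + i * dt
        h = h + dt * f(h, t)  # Euler step; an adaptive solver refines this
    return h
```

In the actual method, the forward pass calls an off-the-shelf adaptive ODE solver, and gradients are obtained by solving a second (adjoint) ODE backwards in time rather than by backpropagating through the solver's internal steps — that is what yields the constant memory cost mentioned above.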
Overview
Syllabus
Introduction
Residual Network
Advantages
Evaluation
Sequential Data
Experiments
Conclusion
Taught by
Yannic Kilcher