

Gradients Are Not All You Need - Machine Learning Research Paper Explained

Yannic Kilcher via YouTube

Overview

Explore the limitations of differentiable programming techniques in machine learning through this in-depth video analysis. Delve into the chaos-based failure mode that affects various differentiable systems, from recurrent neural networks to numerical physics simulations. Examine the connection between this failure mode and the spectrum of the Jacobian, and learn criteria for predicting when differentiation-based optimization algorithms might falter. Investigate examples in policy learning, meta-learning optimizers, and disk packing to understand the practical implications. Discover potential solutions and consider the advantages of black-box methods in overcoming these challenges.
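The core argument, in brief: when a loss is computed by unrolling a dynamical system and differentiating through it, the gradient contains products of per-step Jacobians, so the spectrum of those Jacobians determines whether gradients vanish, stay well-behaved, or explode. The following is a generic sketch of that relationship (notation chosen here for illustration, not copied from the paper):

```latex
% Unrolled dynamics and the resulting gradient (generic notation).
s_{t+1} = f(s_t, \theta), \qquad L(\theta) = \ell(s_T)
\quad\Rightarrow\quad
\frac{\partial L}{\partial \theta}
  = \frac{\partial \ell}{\partial s_T}
    \sum_{k=0}^{T-1}
    \left( \prod_{j=k+1}^{T-1} \frac{\partial s_{j+1}}{\partial s_j} \right)
    \frac{\partial f(s_k, \theta)}{\partial \theta}
% When the eigenvalues of the step Jacobians have magnitude below 1, the
% products shrink (vanishing gradients); when any exceed 1, as in chaotic
% systems, the products, and with them the gradient variance, grow
% exponentially with the horizon T.
```

As a concrete toy illustration, assuming nothing beyond the Python standard library (this is not code from the paper or the video), unrolling the chaotic logistic map and accumulating the derivative by the chain rule shows the loss staying bounded while the gradient magnitude explodes with the number of steps:

```python
# Toy sketch of the chaos-based failure mode: differentiating through an
# unrolled chaotic system. The logistic map x_{t+1} = r * x_t * (1 - x_t)
# is chaotic near r = 3.9; its final state stays in [0, 1], but the
# derivative with respect to r grows roughly exponentially with the horizon.
def unrolled_loss_and_grad(r, x0=0.5, steps=100):
    """Return the final state x_T and its derivative dx_T/dr."""
    x, dx_dr = x0, 0.0
    for _ in range(steps):
        # Chain rule through one step of the map:
        # d x_{t+1} / d r = x_t (1 - x_t) + r (1 - 2 x_t) * d x_t / d r
        dx_dr = x * (1.0 - x) + r * (1.0 - 2.0 * x) * dx_dr
        x = r * x * (1.0 - x)
    return x, dx_dr

for steps in (10, 50, 100, 200):
    loss, grad = unrolled_loss_and_grad(3.9, steps=steps)
    print(f"steps={steps:4d}  loss={loss:.4f}  |dloss/dr|={abs(grad):.3e}")
# The loss is always bounded, yet the gradient magnitude blows up, which is
# why long unrolls make gradient-based optimization unreliable here and why
# the video discusses black-box (gradient-free) alternatives.
```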

Syllabus

- Foreword
- Intro & Overview
- Backpropagation through iterated systems
- Connection to the spectrum of the Jacobian
- The Reparameterization Trick
- Problems of reparameterization
- Example 1: Policy Learning in Simulation
- Example 2: Meta-Learning Optimizers
- Example 3: Disk packing
- Analysis of Jacobians
- What can be done?
- Just use Black-Box methods

Taught by

Yannic Kilcher

