Machine Learning - Dynamical, Statistical, and Economic Perspectives
Society for Industrial and Applied Mathematics via YouTube
Overview
Explore the decision-making aspects of machine learning in this 54-minute conference talk by Michael I. Jordan from the University of California, Berkeley. Delve into the statistical challenges that arise when many decisions must be made, and the economic issues of scarcity and competition in learning systems. Examine algorithmic problems through recent work on continuous-time dynamical-systems perspectives on optimization and diffusion. Cover topics such as gradient descent, proof techniques, lower bounds, unconstrained convex optimization, Lagrangian optimization, symplectic integration, conservative systems, geometry, structure, and dissipative cases. Gain insights into open problems, theorems, and dynamical systems, concluding with a summary and Q&A session.
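As a concrete anchor for the dynamical-systems viewpoint mentioned above, the minimal Python sketch below (not drawn from the talk; the quadratic objective and step size are illustrative assumptions) shows gradient descent as a forward-Euler discretization of the gradient flow dx/dt = -∇f(x).

```python
import numpy as np

# Illustrative quadratic objective f(x) = 0.5 * x^T A x (an assumption, not from the talk).
A = np.diag([1.0, 10.0])

def grad_f(x):
    return A @ x

def gradient_descent(x0, step=0.05, n_steps=200):
    """Forward-Euler discretization of the gradient flow dx/dt = -grad f(x)."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_f(x)  # one Euler step of the continuous-time flow
    return x

print(gradient_descent([5.0, -3.0]))  # converges toward the minimizer at the origin
```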
Syllabus
Intro
Real-world Machine Learning
Gradient Descent
Proof Technique
Lower Bounds
Unconstrained convex optimization
Lagrangian optimization
Symplectic integration
Conservative systems
Geometry
Structure
Dissipative case
Open problems
Theorem
Dynamical Systems
Summary
Q&A
Taught by
Society for Industrial and Applied Mathematics