
Robust Learning of a Single Neuron - Bridging Computational Gaps Using Optimization

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Explore a lecture on robust learning of a single neuron, focusing on bridging computational gaps using insights from optimization. Delve into recent results on learning a neuron with ReLU and other activation functions in the agnostic setting, with the goal of attaining near-optimal mean squared loss. Examine the key role of a surrogate stochastic convex optimization problem in achieving low sample and computational complexity while preserving the target error guarantees. Investigate local error bounds from optimization theory, established under mild distributional assumptions that cover sub-exponential, heavy-tailed, and some discrete distributions. Discover that the constant in the error bound is, perhaps surprisingly, independent of the problem dimension, and why this independence is crucial to the results. Analyze generalizations to other activation functions, including the challenging case of an unknown activation function. Gain insights into computational versus statistical gaps in learning and optimization through this presentation by Jelena Diakonikolas from the University of Wisconsin-Madison at IPAM's EnCORE Workshop.
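
The surrogate convex problem is not spelled out in this summary. As a rough illustration only, assuming a GLM-style "matching" loss for the ReLU (a common convex surrogate, not necessarily the one used in the talk), one can minimize ℓ(w; x, y) = ½ max(0, w·x)² − y (w·x) by stochastic gradient descent; its per-sample gradient is (max(0, w·x) − y) x, and it is convex in w because that gradient is nondecreasing in w·x. The sketch below uses illustrative names and synthetic data, not material from the lecture.

```python
import numpy as np

# Illustrative sketch (an assumption, not the talk's method): learn a single
# ReLU neuron y ~ max(0, w·x) by SGD on the convex "matching" surrogate
#   ell(w; x, y) = 0.5 * max(0, w·x)^2 - y * (w·x).

def surrogate_grad(w, x, y):
    """Stochastic gradient of the convex surrogate at one sample (x, y)."""
    a = x @ w
    return (max(a, 0.0) - y) * x

def sgd_single_neuron(X, Y, lr=0.1, epochs=20, seed=0):
    """Plain SGD over the surrogate loss; returns an estimate of the weights."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            w -= lr * surrogate_grad(w, X[i], Y[i])
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d, n = 5, 2000
    w_true = rng.normal(size=d)
    X = rng.normal(size=(n, d))                                   # Gaussian covariates
    Y = np.maximum(X @ w_true, 0.0) + 0.1 * rng.normal(size=n)    # noisy ReLU labels
    w_hat = sgd_single_neuron(X, Y)
    print("estimation error:", np.linalg.norm(w_hat - w_true))
```

Because the true weight vector is a stationary point of the population surrogate, SGD on this convex objective drives the estimate toward it; the lecture's contribution, per the overview above, concerns the error bounds and distributional assumptions under which such a surrogate approach attains near-optimal loss.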

Syllabus

Jelena Diakonikolas - Robust Learning of a Neuron: Bridging Computational Gaps Using Optimization

Taught by

Institute for Pure & Applied Mathematics (IPAM)

