Robust Learning of a Single Neuron - Bridging Computational Gaps Using Optimization
Institute for Pure & Applied Mathematics (IPAM) via YouTube
Overview
Explore a comprehensive lecture on robust learning of a single neuron, focusing on bridging computational gaps using insights from optimization. Delve into recent results on learning a neuron with ReLU and other activation functions in the agnostic setting, where the goal is near-optimal mean squared loss. Examine the key role of a surrogate stochastic convex optimization problem in achieving low sample and computational complexity while maintaining target error guarantees; a minimal sketch of this idea appears below. Investigate local error bounds from optimization theory, established under mild distributional assumptions covering sub-exponential, heavy-tailed, and some discrete distributions. Discover the surprising independence of the error-bound constant from the problem dimension and its crucial impact on the results. Analyze generalizations to other activation functions, including the challenging case of an unknown activation function. Gain valuable insights into computational vs. statistical gaps in learning and optimization through this in-depth presentation by Jelena Diakonikolas of the University of Wisconsin-Madison at IPAM's EnCORE Workshop.
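The page does not spell out the surrogate problem used in the talk. As a rough illustration only, the sketch below shows one standard convex surrogate from the agnostic single-neuron literature: instead of minimizing the non-convex squared loss E[(relu(w·x) - y)^2], run SGD on L(w) = E[Phi(w·x) - y(w·x)] with Phi the antiderivative of ReLU, which is convex because ReLU is nondecreasing. The function name, step size, and iteration count are all illustrative, not taken from the lecture.

```python
import numpy as np

# Minimal sketch (not necessarily the speaker's exact method) of SGD on a
# convex surrogate for agnostic ReLU regression. The surrogate
#   L(w) = E[Phi(w.x) - y * (w.x)],   Phi(t) = max(t, 0)^2 / 2,
# has gradient E[(relu(w.x) - y) x] and is convex since Phi' = relu is
# nondecreasing.

def surrogate_sgd(X, y, lr=0.01, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            pred = max(X[i] @ w, 0.0)        # relu(w . x_i)
            w -= lr * (pred - y[i]) * X[i]   # stochastic surrogate gradient
    return w

# Illustrative usage on synthetic data with a planted neuron plus noise.
rng = np.random.default_rng(1)
d = 20
w_star = rng.standard_normal(d)
X = rng.standard_normal((2000, d))
y = np.maximum(X @ w_star, 0.0) + 0.1 * rng.standard_normal(2000)
w_hat = surrogate_sgd(X, y)
print("mean squared loss:", np.mean((np.maximum(X @ w_hat, 0.0) - y) ** 2))
```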
Syllabus
Jelena Diakonikolas - Robust Learning of a Neuron: Bridging Computational Gaps Using Optimization
Taught by
Institute for Pure & Applied Mathematics (IPAM)