
Nonnegative Polynomials, Nonconvex Polynomial Optimization, and Applications to Learning

Simons Institute via YouTube

Overview

Explore a comprehensive lecture on nonnegative polynomials, nonconvex polynomial optimization, and their applications in learning. Delve into shape-constrained regression, difference of convex (DC) programming, and monotone regression, including problem definitions and NP-hardness. Examine SOS relaxation techniques, approximation theorems, and numerical experiments in low noise environments. Investigate DC decomposition, the Convex-Concave Procedure (CCP), and strategies for selecting optimal decompositions. Gain insights into undominated decompositions and learn to compare different approaches. Understand the wide-ranging applications of optimization over nonnegative polynomials and the power of SDP/SOS-based relaxations in this field.
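To make the SOS idea mentioned above concrete: a polynomial p is a sum of squares exactly when p(x) = z(x)ᵀQz(x) for some positive semidefinite Gram matrix Q, where z(x) is a vector of monomials, and checking this reduces to semidefinite programming. A minimal sketch with a hand-picked example polynomial and Gram matrix (illustrative assumptions, not taken from the lecture):

```python
import numpy as np

# Example: p(x) = x^4 + 2x^2 + 1, with monomial vector z(x) = [1, x, x^2].
# p is SOS iff p(x) = z(x)^T Q z(x) for some positive semidefinite Q.
# Here Q = v v^T with v = [1, 0, 1], certifying p(x) = (x^2 + 1)^2.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# Certificate check 1: Q is positive semidefinite.
assert np.linalg.eigvalsh(Q).min() >= -1e-9

# Certificate check 2: z(x)^T Q z(x) reproduces p(x) at sample points.
for x in np.linspace(-2.0, 2.0, 9):
    z = np.array([1.0, x, x**2])
    assert abs(z @ Q @ z - (x**4 + 2*x**2 + 1)) < 1e-9
```

In general one does not guess Q by hand; an SDP solver searches over all PSD Gram matrices matching p's coefficients, which is what the SDP/SOS-based relaxations in the lecture exploit.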

Syllabus

Intro
Optimizing over nonnegative polynomials
1. Shape-constrained regression
2. Difference of convex (DC) programming: problems of the form min f₀(x)
Monotone regression: problem definition
NP-hardness and SOS relaxation
Approximation theorem
Numerical experiments (1/2): low noise environment
Difference of convex (DC) decomposition
Existence of DC decomposition (2/3)
Convex-Concave Procedure (CCP)
Picking the "best" decomposition for CCP
Undominated decompositions (1/2)
Comparing different decompositions (1/2)
Main messages: optimization over nonnegative polynomials has many applications; powerful SDP/SOS-based relaxations are available
Uniqueness of DC decomposition
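The DC decomposition and Convex-Concave Procedure items in the syllabus can be sketched on a toy one-dimensional problem (the specific polynomial, decomposition, and closed-form subproblem solution below are illustrative assumptions, not taken from the lecture): write f(x) = x⁴ − 2x² as g(x) − h(x) with g(x) = x⁴ and h(x) = 2x² both convex; CCP then repeatedly linearizes h at the current iterate and minimizes the resulting convex surrogate.

```python
import numpy as np

# Toy DC program: minimize f(x) = x^4 - 2x^2 = g(x) - h(x),
# with g(x) = x^4 and h(x) = 2x^2 both convex on the real line.
g = lambda x: x**4
h = lambda x: 2 * x**2

def ccp_step(xk):
    # Convex surrogate: g(x) - [h(xk) + h'(xk) * (x - xk)], with h'(xk) = 4*xk.
    # Its minimizer solves 4 x^3 - 4 xk = 0, i.e. x = cbrt(xk) (closed form
    # here; in general each CCP step is a convex optimization problem).
    return np.cbrt(xk)

x = 2.0                      # starting point
for _ in range(40):
    x = ccp_step(x)

# Iterates converge to the stationary point x = 1, where f(1) = -1.
print(x, g(x) - h(x))
```

Different DC decompositions of the same f give different CCP iterations, which is why the lecture discusses picking the "best" (e.g., undominated) decomposition.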

Taught by

Simons Institute

