Introductory Lectures on First-Order Convex Optimization - Lecture 1

International Centre for Theoretical Sciences via YouTube

Overview

Dive into the fundamentals of first-order convex optimization in this lecture by Praneeth Netrapalli. Explore gradient-based optimization techniques, including gradient descent and Nesterov's accelerated gradient algorithm. Examine the complexity of implementing oracles and of optimizing given access to an oracle, and work through key theorems, proofs, and lower bounds. Gain insight into smoothness and estimate sequences, and follow the rearrangement and telescoping-sum arguments used in the convergence proofs. Suited to advanced graduate students, postdocs, and researchers in theoretical physics and computer science interested in applications to machine learning and statistical physics.

Syllabus

Introductory Lectures on First-Order Convex Optimization - Lecture 1
Gradient-based optimization
Complexity of implementing an oracle and Complexity of optimization given access to an oracle
Gradient Descent
Theorem
Remark
Proof
Rearranging and telescoping the sum gives
Lower bounds: Theorem
Smoothness
Theorem
Proof
Nesterov's accelerated gradient algorithm
Estimate Sequences
Lemma
Proof
Observation
Compute

Taught by

International Centre for Theoretical Sciences
