

Modern Optimization Methods

Beijing Institute of Technology via XuetangX

Overview

This course focuses on the basic definitions and theory of numerical optimization, including optimality conditions for unconstrained and constrained problems, together with algorithms for both settings. It also covers the semismooth Newton method, which plays an important role in large-scale numerical optimization. Finally, several applications of optimization are introduced, including algorithms for hypergraph matching and for support vector machines.
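As a small taste of the unconstrained-optimization material (Chapters 2 and 3), here is a minimal sketch of steepest descent with an Armijo backtracking line search. This is an illustration only, not code from the course; the test function and all parameter values are my own choices.

```python
import math

def backtracking_line_search(f, grad_f, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink alpha until the sufficient-decrease
    condition f(x + alpha*d) <= f(x) + c*alpha*(grad f . d) holds."""
    fx = f(x)
    slope = sum(gi * di for gi, di in zip(grad_f(x), d))
    while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def steepest_descent(f, grad_f, x0, tol=1e-8, max_iter=5000):
    """Minimize f by moving along the negative gradient, choosing the
    step length with backtracking, until the gradient norm is small."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad_f(x)
        if math.sqrt(sum(gi * gi for gi in g)) < tol:
            break
        d = [-gi for gi in g]  # steepest-descent search direction
        a = backtracking_line_search(f, grad_f, x, d)
        x = [xi + a * di for xi, di in zip(x, d)]
    return x

# Example: minimize f(x) = x1^2 + 10*x2^2, whose minimizer is the origin.
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad_f = lambda x: [2 * x[0], 20 * x[1]]
x_star = steepest_descent(f, grad_f, [3.0, -2.0])
```

The Wolfe conditions discussed in Chapter 3 add a curvature requirement on top of the sufficient-decrease condition used here; backtracking is the simpler of the two inexact strategies.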

Syllabus

  • Chapter 1. Introduction
    • 1.1 About optimization
    • 1.2 Classification of optimization
    • 1.3 Preliminaries in convex analysis
  • Chapter 2. Fundamentals of Unconstrained Optimization
    • 2.1 What is a Solution?
    • 2.2 Optimality Conditions I
    • 2.3 Optimality Conditions II
    • 2.4 Line search strategy
    • 2.5 Search direction II
    • 2.6 Convergence
  • Chapter 3. Line Search Methods
    • 3.1 Exact Step Length
    • 3.2 The Wolfe conditions
    • 3.3 Inexact Line Search II
    • 3.4 Convergence of Line Search Methods
    • 3.5 Convergence Rate
  • Chapter 4. Trust Region Methods
    • 4.1 Main Idea of Trust Region Methods
    • 4.2 Trust-Region Algorithm
    • 4.3 Solving Subproblem
    • 4.4 Solving Subproblem II
    • 4.5 Convergence
  • Chapter 5. Conjugate Gradient Methods
    • 5.1 Conjugate direction method
    • 5.2 Property of conjugate direction method
    • 5.3 Conjugate gradient method
    • 5.4 Rate of convergence
    • 5.5 Nonlinear conjugate gradient method
    • 5.6 Convergence of nonlinear conjugate gradient method
  • Chapter 6. Semismooth Newton's Method
    • 6.1 Semismoothness
    • 6.2 Semismooth Newton's Method
    • 6.3 Support Vector Machine
    • 6.4 Semismooth Newton's Method for SVM
    • 6.5 Exploring Sparsity in SVM
  • Chapter 7. Theory of Constrained Optimization
    • 7.1 Local and Global Solutions
    • 7.2 Examples One
    • 7.3 Examples Two and Three
    • 7.4 Constraint Qualifications
    • 7.5 First-Order Optimality Conditions
    • 7.6 Second Order Necessary Condition
    • 7.7 Second Order Sufficient Condition
    • 7.8 Duality
  • Chapter 8. Further Discussions on Constrained Optimization
    • 8.1 KKT conditions
    • 8.2 An Example
    • 8.3 Dual Problem
  • Chapter 9. Penalty and Augmented Lagrangian Methods
    • 9.1 Quadratic Penalty Method
    • 9.2 Exact Penalty Function
    • 9.3 Augmented Lagrangian Method
    • 9.4 Quadratic Penalty Method for Hypergraph Matching
    • 9.5 Quadratic Penalty Method for Hypergraph Matching: II
    • 9.6 Augmented Lagrangian Method for SVM
    • 9.7 Augmented Lagrangian Method for SVM: II
  • Final Exam

    Taught by

    Li Qingna
