

现代优化方法(英) (Modern Optimization Methods, in English)

Beijing Institute of Technology via XuetangX

Overview

This course is designed for graduate students. Through it, students learn the fundamentals of modern optimization theory and how to apply that theory to solve practical problems. The course mainly covers the basic theory and algorithms of continuous optimization, including optimality conditions and classical algorithms for constrained and unconstrained optimization. It also covers the semismooth Newton method and applications of classical optimization algorithms to frontier research problems. By the end of the course, students will be familiar with the design of basic optimization algorithms and with using optimization methods to solve real-world problems. The course has the following features:

1. Case-based teaching: frontier research cases are introduced to show how optimization algorithms are used in current research, giving learners a multi-dimensional view of each concept.
2. Research-oriented instruction: building on the case studies, the instructor's own research results are woven into the teaching, breaking down the wall between teaching and research so that the two reinforce each other in a virtuous cycle.
3. Learning by doing: concepts are introduced as practical needs arise, closing the loop from a concept's origin through its development to its application.
4. Graphical presentation: visualizations make the concepts simpler to understand and easier to master.

The final exam will be released within the two weeks before the course concludes.

Syllabus

  • Chapter 1. Introduction
    • 1.1 About optimization
    • 1.2 Classification of optimization
    • 1.3 Preliminaries in convex analysis
  • Chapter 2. Fundamentals of Unconstrained Optimization
    • 2.1 What is a Solution?
    • 2.2 Optimality Conditions I
    • 2.3 Optimality Conditions II
    • 2.4 Line search strategy
    • 2.5 Search direction II
    • 2.6 Convergence
  • Chapter 3. Line Search Methods
    • 3.1 Exact Step Length
    • 3.2 The Wolfe conditions
    • 3.3 Inexact Line Search II
    • 3.4 Convergence of Line Search Methods
    • 3.5 Convergence Rate
  • Chapter 4. Trust Region Methods
    • 4.1 Main Idea of Trust Region Methods
    • 4.2 Trust-Region Algorithm
    • 4.3 Solving Subproblem
    • 4.4 Solving Subproblem II
    • 4.5 Convergence
  • Chapter 5. Conjugate Gradient Methods
    • 5.1 Conjugate direction method
    • 5.2 Property of conjugate direction method
    • 5.3 Conjugate gradient method
    • 5.4 Rate of convergence
    • 5.5 Nonlinear conjugate gradient method
    • 5.6 Convergence of nonlinear conjugate gradient method
  • Chapter 6. Semismooth Newton's Method
    • 6.1 Semismoothness
    • 6.2 Semismooth Newton's Method
    • 6.3 Support Vector Machine
    • 6.4 Semismooth Newton's Method for SVM
    • 6.5 Exploring Sparsity in SVM
  • Chapter 7. Theory of Constrained Optimization
    • 7.1 Local and Global Solutions
    • 7.2 Example One
    • 7.3 Examples Two and Three
    • 7.4 Constraint Qualifications
    • 7.5 First-Order Optimality Conditions
    • 7.6 Second-Order Necessary Condition
    • 7.7 Second-Order Sufficient Condition
    • 7.8 Duality
  • Chapter 8. Further Discussions on Constrained Optimization
    • 8.1 KKT conditions
    • 8.2 An Example
    • 8.3 Dual Problem
  • Chapter 9. Penalty and Augmented Lagrangian Methods
    • 9.1 Quadratic Penalty Method
    • 9.2 Exact Penalty Function
    • 9.3 Augmented Lagrangian Method
    • 9.4 Quadratic Penalty Method for Hypergraph Matching
    • 9.5 Quadratic Penalty Method for Hypergraph Matching: II
    • 9.6 Augmented Lagrangian Method for SVM
    • 9.7 Augmented Lagrangian Method for SVM: II
  • Final Exam

    Taught by

    LI QINGNA
