Gradient and Hessian Approximations for Model-based Blackbox Optimization

GERAD Research Center via YouTube

Overview

Explore gradient and Hessian approximations for model-based blackbox optimization in this 48-minute seminar from the GERAD Research Center. Delve into the mathematical theory behind optimizing blackbox functions, which return outputs without exposing derivatives or internal structure. Examine classical and novel approximation techniques for such functions, and see their application in a medical physics case study. Learn about solid state tank design optimization, order-N accuracy, Newton's method, and gradient-based models. Investigate generalized simplex gradients, pseudoinverses, error bounds, and centred simplex gradients. Discover adjusted gradient techniques, simplex Hessians, and open research directions in this talk by Warren Hare of the University of British Columbia.
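To give a flavour of the core technique named in the overview, here is a minimal sketch of a generalized simplex gradient: sample the blackbox at a base point and along a set of directions, then recover a least-squares gradient estimate via the Moore-Penrose pseudoinverse. The function name, the test function, and the choice of directions are illustrative assumptions, not the speaker's exact formulation.

```python
import numpy as np

def generalized_simplex_gradient(f, x0, D):
    """Estimate the gradient of blackbox f at x0 from samples x0 + d,
    one per column d of D. Solves D^T g ~= delta in the least-squares
    sense using the pseudoinverse (illustrative sketch, not the
    speaker's exact construction)."""
    delta = np.array([f(x0 + d) - f(x0) for d in D.T])
    return np.linalg.pinv(D.T) @ delta

# Toy blackbox: f(x) = x1^2 + 3*x2^2; true gradient at (1, 1) is (2, 6).
f = lambda x: x[0] ** 2 + 3 * x[1] ** 2
x0 = np.array([1.0, 1.0])
h = 1e-5
# Three directions in R^2: overdetermined, so the pseudoinverse matters.
D = h * np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
g = generalized_simplex_gradient(f, x0, D)
print(g)  # close to [2, 6]
```

With forward differences like this, the error bound is of order h; the centred simplex gradient discussed in the talk tightens this by sampling in both +d and -d directions.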

Syllabus

Gradient and Hessian Approximations for Model-based Blackbox Optimization
Solid state tank design
Optimizing the design
Order-N accuracy at x
Newton's Method
Proof
Models from gradients
A cleaner approach
Generalizing the Simplex Gradient
Pseudo inverses
Generalized Simplex Gradient error bound
Centred Simplex Gradients
Adjusted generalized centred simplex gradient
Adjusted Centred Simplex Gradient
A simpler approach
Generalized Simplex Hessian
Summary
Open directions

Taught by

GERAD Research Center
