Continuous-in-time Limit for Bandits

USC Probability and Statistics Seminar via YouTube

Overview

Explore the connection between Hamilton-Jacobi-Bellman (HJB) equations and multi-armed bandit (MAB) problems in this 44-minute talk from the USC Probability and Statistics Seminar series. Delve into the first work to establish this connection in a general setting, presented by Yuhua Zhu of UCSD. Learn about an efficient algorithm for solving MAB problems built on this newly established link, and discover its practical applications. Gain insight into the exploration-exploitation trade-off in sequential decision-making under uncertainty, the central challenge of the MAB paradigm.
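
The listing does not spell out the HJB-based algorithm from the talk, so the following is only a generic illustration of the exploration-exploitation trade-off the abstract refers to: a minimal UCB1 bandit loop in Python. The pull function, the number of arms, and the Bernoulli reward probabilities are all hypothetical, not taken from the talk.

import math
import random

def ucb1(pull, n_arms, horizon):
    # Generic UCB1: sample each arm once, then pick the arm maximizing
    # empirical mean + sqrt(2 * ln t / pulls) -- exploitation plus an
    # exploration bonus that shrinks as an arm is sampled more often.
    counts = [0] * n_arms
    means = [0.0] * n_arms
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # initialization round: try every arm once
        else:
            arm = max(range(n_arms),
                      key=lambda i: means[i] + math.sqrt(2 * math.log(t) / counts[i]))
        reward = pull(arm)
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # incremental mean update
    return means, counts

# Hypothetical setup: three Bernoulli arms with unknown success rates.
probs = [0.3, 0.5, 0.7]
means, counts = ucb1(lambda i: 1.0 if random.random() < probs[i] else 0.0,
                     n_arms=3, horizon=5000)
print(counts)  # the 0.7 arm should accumulate most of the pulls

As the horizon grows, the exploration bonus decays for well-sampled arms, so the loop shifts from exploring all arms toward exploiting the empirically best one.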

Syllabus

Yuhua Zhu: Continuous-in-time Limit for Bandits (UCSD)

Taught by

USC Probability and Statistics Seminar
