Overview
Explore advanced concepts in multi-armed bandit algorithms with Kevin Jamieson of the University of Washington in this hour-long lecture. Building on foundational bandit theory, the lecture examines recent research and applications, covering more complex strategies for decision-making under uncertainty and techniques used in adaptive experimentation, online learning, and optimization across a range of domains.
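As a minimal illustration of the kind of algorithm this material builds on, the sketch below implements the classical UCB1 strategy for a stochastic multi-armed bandit. It is a generic textbook example, not content taken from the lecture; the arm probabilities and function names are illustrative assumptions.

```python
import math
import random

def ucb1(pull_arm, n_arms, horizon):
    """UCB1: pull each arm once, then always pull the arm with the
    highest upper confidence bound on its estimated mean reward."""
    counts = [0] * n_arms      # times each arm has been pulled
    means = [0.0] * n_arms     # empirical mean reward of each arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1        # initialization: try every arm once
        else:
            # exploration bonus shrinks as an arm accumulates pulls
            arm = max(range(n_arms),
                      key=lambda i: means[i] + math.sqrt(2 * math.log(t) / counts[i]))
        reward = pull_arm(arm)
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]  # running mean update
    return means, counts

# Hypothetical example: three Bernoulli arms with unknown success probabilities.
probs = [0.3, 0.5, 0.7]
means, counts = ucb1(lambda i: 1.0 if random.random() < probs[i] else 0.0,
                     n_arms=3, horizon=2000)
print(counts)  # the best arm (index 2) should receive most of the pulls
```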
Syllabus
Bandits II (Kevin Jamieson, University of Washington)
Taught by
Paul G. Allen School