Overview
Explore a seminar on stochastic gradient descent methods with biased estimators, presented by Quoc Tran-Dinh of the University of North Carolina at Chapel Hill. Delve into recent advances in gradient descent algorithms, their variants, and practical applications in machine learning. Gain insight into the speaker's research on stochastic gradient-based methods for large-scale optimization and minimax problems, with potential applications in deep learning, statistical learning, generative adversarial networks, and federated learning. Learn about his collaborative work with researchers from UNC and IBM, and understand both the theoretical and practical aspects of these optimization techniques.
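For readers new to the topic, the short sketch below contrasts standard (unbiased) minibatch SGD with a simple momentum-style biased gradient estimator on a least-squares problem. It is only an illustration of the kind of estimator the seminar title refers to, not the speaker's algorithm; the problem setup, step sizes, and parameter names are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch only (not the speaker's method): compare plain minibatch SGD,
# whose gradient estimate is unbiased, with an exponential-moving-average estimator,
# which is biased but has lower variance, on a synthetic least-squares problem.

rng = np.random.default_rng(0)
n, d = 1000, 10
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.1 * rng.normal(size=n)

def minibatch_grad(x, batch):
    # Stochastic gradient of 0.5*||A x - b||^2 / n restricted to a minibatch.
    Ab = A[batch]
    return Ab.T @ (Ab @ x - b[batch]) / len(batch)

def sgd(biased=False, steps=2000, lr=0.05, batch_size=32, beta=0.9):
    x = np.zeros(d)
    v = np.zeros(d)  # running (biased) gradient estimate
    for _ in range(steps):
        batch = rng.choice(n, size=batch_size, replace=False)
        g = minibatch_grad(x, batch)
        if biased:
            # Exponential moving average: E[v] differs from the true gradient
            # (hence "biased"), but its variance is reduced.
            v = beta * v + (1 - beta) * g
            x -= lr * v
        else:
            x -= lr * g
    return np.linalg.norm(x - x_true)

print("unbiased SGD error:        ", sgd(biased=False))
print("biased-estimator SGD error:", sgd(biased=True))
```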
Syllabus
[Seminar Series] Stochastic Gradient Descent Methods with Biased Estimators
Taught by
VinAI