

Heterogeneity-Aware Algorithms for Federated Learning and Distributed Optimization

Centre for Networked Intelligence, IISc via YouTube

Overview

Learn about federated learning and distributed optimization in this technical lecture from Carnegie Mellon University Associate Professor Gauri Joshi. Explore the challenges and solutions for implementing machine learning at the edge, where data collection and model training occur on resource-constrained mobile devices. Dive into the complexities of heterogeneity in federated learning systems, examining how variations in data, communication, and computation across edge clients impact system performance. Discover recent algorithmic developments designed to address these heterogeneity challenges, including approaches to client selection and local adaptive optimization. Follow along as Prof. Joshi, an MIT Technology Review 35 Innovators Under 35 honoree and NSF CAREER Award winner, breaks down key concepts ranging from stochastic gradient descent to next-word prediction applications, while addressing open questions about scalability and flexibility in federated optimization systems.
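The lecture itself contains no code, but the ideas named above can be illustrated concretely. Below is a minimal sketch, on synthetic data, of a federated round in which clients have differing local objectives (data heterogeneity), each selected client runs a few local SGD steps, and clients are chosen by a loss-based rule in the spirit of the Power-of-Choice selection discussed in the talk. The synthetic linear-regression setup, function names, and hyperparameters are illustrative assumptions, not material from the lecture.

```python
# Minimal sketch (not from the lecture): federated averaging with a
# loss-based, Power-of-Choice-style client selection on synthetic data.
# All names and hyperparameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
num_clients, dim = 20, 5

# Synthetic, non-IID clients: each client k has its own data (X_k, y_k),
# so the local objectives F_k(w) differ across clients.
clients = []
for k in range(num_clients):
    w_true = rng.normal(scale=1 + k / num_clients, size=dim)  # data heterogeneity
    X = rng.normal(size=(50, dim))
    y = X @ w_true + 0.1 * rng.normal(size=50)
    clients.append((X, y))

def local_loss(w, X, y):
    """Client's local objective: mean squared error."""
    return 0.5 * np.mean((X @ w - y) ** 2)

def local_sgd(w, X, y, steps=5, lr=0.05, batch=10):
    """Run a few local SGD steps on one client's objective."""
    w = w.copy()
    for _ in range(steps):
        idx = rng.choice(len(y), size=batch, replace=False)
        grad = X[idx].T @ (X[idx] @ w - y[idx]) / batch
        w -= lr * grad
    return w

def power_of_choice(w, clients, d=8, m=3):
    """Sample d candidate clients, keep the m with the largest local loss."""
    cand = rng.choice(len(clients), size=d, replace=False)
    losses = [local_loss(w, *clients[k]) for k in cand]
    return cand[np.argsort(losses)[-m:]]

# Federated training loop: select clients, run local updates, average.
w_global = np.zeros(dim)
for rnd in range(50):
    selected = power_of_choice(w_global, clients)
    updates = [local_sgd(w_global, *clients[k]) for k in selected]
    w_global = np.mean(updates, axis=0)

print("average loss:", np.mean([local_loss(w_global, X, y) for X, y in clients]))
```

Replacing power_of_choice with uniform-at-random sampling recovers the basic algorithm covered early in the syllabus, which is one way to see how client selection interacts with data heterogeneity.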

Syllabus

Introduction
Stochastic Gradient Descent (SGD)
Application: Next Word Prediction
Federated Learning
Local Objective Functions
Basic Algorithm
Sources of Heterogeneity
Why is this a problem?
Quantifying Heterogeneity
Open Question 1
Open Question 2
Communication heterogeneity
Client selection
Example
Power of Choice Selection
Summary
Questions
Local Adaptive Optimization
Key takeaway
Other interesting directions

Taught by

Centre for Networked Intelligence, IISc
