Overview
Explore federated learning, an approach to training machine learning models across many distributed devices while preserving privacy. Discover how it allows models to learn from decentralized data without sensitive information ever leaving users' devices. Gain insights into Google's production deployment of federated learning and learn how TensorFlow Federated enables researchers to simulate the technique on their own datasets. Delve into the federated learning workflow, compare it to traditional distributed learning, and understand its applications in language modeling, including how models handle new words. This 41-minute conference talk from Google I/O '19, presented by Daniel Ramage and Emily Glanz, covers key concepts such as secure aggregation, federated computation, and the distinction between federated and decentralized computing.
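For readers who want to try the simulation workflow the talk describes, the sketch below follows the pattern of TensorFlow Federated's image-classification tutorial: each simulated client trains locally on its own shard of federated EMNIST, and only model updates are averaged on the server. It is a minimal illustration, not code from the talk; API names such as tff.learning.build_federated_averaging_process reflect the tutorial-era TFF release and may differ in newer versions.

```python
import tensorflow as tf
import tensorflow_federated as tff

# Federated EMNIST: handwriting data already partitioned by writer,
# so each simulated "client" holds only its own examples.
emnist_train, _ = tff.simulation.datasets.emnist.load_data()

def preprocess(dataset):
    def batch_format_fn(element):
        # Flatten 28x28 images and reshape labels for the Keras model.
        return (tf.reshape(element['pixels'], [-1, 784]),
                tf.reshape(element['label'], [-1, 1]))
    return dataset.batch(20).map(batch_format_fn)

example_dataset = preprocess(
    emnist_train.create_tf_dataset_for_client(emnist_train.client_ids[0]))

def model_fn():
    # A deliberately tiny model; the point is the federated round structure.
    keras_model = tf.keras.models.Sequential([
        tf.keras.layers.InputLayer(input_shape=(784,)),
        tf.keras.layers.Dense(10, kernel_initializer='zeros'),
        tf.keras.layers.Softmax(),
    ])
    return tff.learning.from_keras_model(
        keras_model,
        input_spec=example_dataset.element_spec,
        loss=tf.keras.losses.SparseCategoricalCrossentropy(),
        metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])

# Federated Averaging: clients train locally, the server averages the updates.
trainer = tff.learning.build_federated_averaging_process(
    model_fn,
    client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.02))

state = trainer.initialize()
client_datasets = [
    preprocess(emnist_train.create_tf_dataset_for_client(cid))
    for cid in emnist_train.client_ids[:10]]  # a small sample of clients per round

for round_num in range(5):
    state, metrics = trainer.next(state, client_datasets)
    print(f'round {round_num}: {metrics}')
```

In the production setting the talk covers, the "clients" are phones that participate only when idle and charging, and secure aggregation means the server sees only the combined update; in simulation, the same round structure simply runs locally over a partitioned dataset.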
Syllabus
Intro
Agenda
Decentralized Data
Federated Computation
Federated Computation vs Decentralized Computation
Secure Aggregation
Federated Learning Workflow
Federated Learning vs Traditional Distributed Learning
Language Modeling
New Words
Don't Memorize
Taught by
TensorFlow