OpenFL - A Federated Learning Project to Power and Secure Your Projects
Toronto Machine Learning Series (TMLS) via YouTube
Overview
Discover the power of federated learning with OpenFL, a Python 3 framework designed for collaborative model training without compromising sensitive data. This 29-minute talk by Ezequiel Lanza, AI Open Source Evangelist at Intel, introduces OpenFL as a flexible, extensible, and easy-to-learn tool for data scientists. Explore how this community-supported project, originally developed by Intel Labs and the Intel Internet of Things Group, enables organizations to train models collaboratively while keeping their data private.

Learn about OpenFL's narrow interfaces and its ability to run processes within Trusted Execution Environments (TEEs), providing data and model confidentiality, computational integrity, and attestation of compute resources. Discover real-world applications, such as the collaboration between Intel Labs and the University of Pennsylvania (UPenn), which used data from 71 medical institutions to test federated learning for brain tumor boundary detection. Gain insight into how federated learning, backed by hardware and software security, can protect sensitive data at its source while still benefiting from larger datasets. Understand how to adopt, contribute to, and secure federated learning projects with OpenFL, and join the community-driven effort to advance this technology.
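To make the collaborative-training idea concrete, the sketch below simulates the federated averaging pattern that frameworks like OpenFL orchestrate: each site trains on its own data, and only the resulting model weights are sent to an aggregator. This is a minimal, self-contained illustration in plain NumPy, not OpenFL's actual API; the function names, data, and parameters are hypothetical and chosen only to show the pattern.

import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    # One collaborator's local training step (plain logistic-regression SGD).
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-data @ w))       # sigmoid predictions
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad                                 # update using local data only
    return w

def federated_average(local_weights, sizes):
    # Aggregator combines the updates, weighted by each site's dataset size.
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(local_weights, sizes))

# Three simulated hospitals; raw data never leaves each site.
rng = np.random.default_rng(0)
sites = [(rng.normal(size=(100, 5)), rng.integers(0, 2, 100)) for _ in range(3)]
global_w = np.zeros(5)

for _ in range(10):                                    # federated training rounds
    updates = [local_update(global_w, X, y) for X, y in sites]
    global_w = federated_average(updates, [len(y) for _, y in sites])

The key property, which OpenFL enforces across real networks and can additionally protect inside TEEs, is that only model updates travel between participants; the training data stays where it was collected.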
Syllabus
OpenFL - A Federated Learning Project to Power and Secure Your Projects
Taught by
Toronto Machine Learning Series (TMLS)