
FATE-LLM - Empowering Large Language Models with Federated Learning

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Explore the intersection of large language models (LLMs) and federated learning in this 28-minute conference talk by Fangchi Wang and Layne Peng from VMware. Discover how FATE-LLM, an open-source federated learning platform, addresses challenges in LLM development such as data scarcity and privacy concerns. Learn about the integration of popular LLMs like ChatGLM and LLaMA into the federated learning paradigm, and understand the technical considerations for efficiency and security. Gain insights into KubeFATE, a cloud-native solution for managing FATE on Kubernetes, and its role in accelerating FATE-LLM workflows. Examine real-world experiments, evaluations, and the future roadmap of this innovative approach to empowering large language models.
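To illustrate the federated learning paradigm the talk applies to LLMs, the sketch below shows a simplified federated-averaging round over small adapter weights: each party fine-tunes on its own private data and only the aggregated weights are shared. All names and functions here are illustrative assumptions for explanation only, not FATE-LLM's actual API.

```python
# Illustrative sketch only: simplified federated averaging (FedAvg) over
# lightweight adapter weights, the general idea behind federated LLM
# fine-tuning. Names are hypothetical and do not reflect FATE-LLM's real API.
from typing import Dict, List
import numpy as np


def local_update(global_adapter: Dict[str, np.ndarray],
                 private_texts: List[str]) -> Dict[str, np.ndarray]:
    """Each party fine-tunes the shared adapter on its own private data.

    In a real federated LLM setup, a party would run gradient steps on a
    local model such as ChatGLM or LLaMA; here we just perturb the weights
    to stand in for local training.
    """
    rng = np.random.default_rng(len(private_texts))
    return {name: w + 0.01 * rng.standard_normal(w.shape)
            for name, w in global_adapter.items()}


def federated_average(updates: List[Dict[str, np.ndarray]]) -> Dict[str, np.ndarray]:
    """The server aggregates party updates without ever seeing raw data."""
    return {name: np.mean([u[name] for u in updates], axis=0)
            for name in updates[0]}


if __name__ == "__main__":
    # Hypothetical adapter with two small weight matrices.
    global_adapter = {"lora_A": np.zeros((4, 2)), "lora_B": np.zeros((2, 4))}
    party_data = [["doc a1", "doc a2"], ["doc b1"], ["doc c1", "doc c2", "doc c3"]]

    for round_idx in range(3):  # a few federated rounds
        updates = [local_update(global_adapter, data) for data in party_data]
        global_adapter = federated_average(updates)
        print(f"round {round_idx}: lora_A mean = {global_adapter['lora_A'].mean():.4f}")
```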

Syllabus

FATE-LLM: Empowering Large Language Models with Federated Learning - Fangchi Wang & Layne Peng

Taught by

CNCF [Cloud Native Computing Foundation]
