Building Edge AI Stack and AI-as-a-Service in Cloud Native Way

Linux Foundation via YouTube

Overview

Explore the architecture and implementation of an edge AI stack and AI-as-a-Service in a cloud-native environment. Delve into the KubeEdge architecture, the Akraino KubeEdge Edge Service Blueprint, and the ML offloading functional block diagram. Examine use cases for device-app ML model inference offloading workflows and address edge AI challenges. Learn about the KubeEdge-AIService architecture and edge-cloud collaborative techniques such as joint inference, incremental learning, and federated learning. Gain insights into developer perspectives on joint inference and federated learning, as well as resource information for building robust edge AI solutions.
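Among the techniques the session covers, federated learning keeps raw data at the edge and aggregates only model updates in the cloud. A minimal sketch of the core aggregation step (weighted averaging of client model weights) is below; the function names and flat weight lists are illustrative assumptions, not code from the course or from KubeEdge:

```python
# Illustrative federated-averaging step: combine per-edge-node model
# weights into a global model, weighting each node by its local
# dataset size. Names and data layout are hypothetical placeholders.

def federated_average(client_weights, client_sizes):
    """Weighted average of flat weight vectors from edge nodes.

    client_weights: list of equal-length lists of floats (one per node)
    client_sizes:   list of local sample counts (one per node)
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two edge nodes with equal data sizes: the global model is the mean.
global_model = federated_average([[1.0, 2.0], [3.0, 4.0]], [1, 1])
```

Only the averaged weights cross the edge-cloud boundary, which is what allows the raw data to stay on the edge node.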

Syllabus

Intro
Problem and Challenges
Background
KubeEdge Architecture
Akraino KubeEdge Edge Service Blueprint
KubeEdge ML Offloading Functional Block Diagram
Use Case: Device App ML model inference offloading workflow
Edge AI Challenges
KubeEdge-AIService Architecture
Edge-cloud Collaborative JOINT INFERENCE: Improve inference performance when edge resources are limited
Edge-cloud Collaborative INCREMENTAL LEARNING: The more the models are used, the smarter they become
Edge-cloud Collaborative FEDERATED LEARNING: Raw data is not transmitted out of the edge, and the model is generated by aggregating edge-side training results
Developer perspective: JOINT INFERENCE
Developer perspective: FEDERATED LEARNING
Resource Information
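The joint-inference pattern listed in the syllabus (run a small model at the edge, fall back to a larger cloud model when edge resources or confidence are limited) can be sketched as follows. This is an illustrative toy, not KubeEdge-AIService code; the model functions and threshold are assumed placeholders:

```python
# Illustrative edge-cloud collaborative joint inference: the edge model
# answers when it is confident; otherwise the request is offloaded to a
# (hypothetical) cloud model. All names here are placeholders.

def edge_model(x):
    # Toy lightweight model: confident only on "easy" inputs.
    confidence = 1.0 if abs(x) < 5 else 0.4
    return "edge-label", confidence

def cloud_model(x):
    # Toy large model: assumed accurate on everything.
    return "cloud-label", 0.99

def joint_inference(x, threshold=0.8):
    """Return (label, where_it_ran) for input x."""
    label, confidence = edge_model(x)
    if confidence >= threshold:
        return label, "edge"        # fast path, stays on the edge
    label, _ = cloud_model(x)       # hard case: offload to the cloud
    return label, "cloud"
```

The confidence threshold is the tuning knob: raising it sends more traffic to the cloud (better accuracy, higher latency and bandwidth), lowering it keeps more inference at the edge.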

Taught by

Linux Foundation
