

Leveraging Sparsity for Fast Response Times in Edge AI - tinyML Summit 2021

tinyML via YouTube

Overview

Explore a 23-minute conference talk from the tinyML Summit 2021 Partner Session on Edge Applications, focusing on leveraging sparsity for fast response times in edge computing. Delve into NeuronFlow, a novel multi-core processor architecture that exploits various forms of sparsity to create a scalable dataflow processing engine for AI applications at the edge. Learn about the significance of low latency in Edge AI applications, the metrics used to measure latency, and how those metrics correlate with application performance. Discover how NeuronFlow's sparsity-exploiting design enables real-time live AI applications where rapid response times are crucial. Gain insights into event-based self-timed scheduling, single-batch processing, resource allocation, and synchronization in the NeuronFlow architecture. Examine the concept of sparsity through practical examples comparing frames per second (FPS) with delta frames, and get an overview of the GrAI One chip. This presentation by Orlando Moreira, Fellow and Chief Architect at GrAI Matter Labs, offers valuable knowledge for professionals interested in cutting-edge AI technologies for edge computing.
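The delta-frame idea mentioned above can be sketched in a few lines: rather than running a dense computation over every pixel of every frame (the FPS-driven model), an event-based pipeline forwards only the pixels that changed since the last frame. The snippet below is a minimal illustration of that principle, not code from the talk; the function name, threshold, and synthetic frames are assumptions for the sake of the example.

```python
import numpy as np

def delta_events(prev_frame, frame, threshold=8):
    """Return the coordinates and delta values of pixels whose change
    between two frames exceeds the threshold. Only these "events" would
    be propagated through an event-based pipeline."""
    delta = frame.astype(np.int16) - prev_frame.astype(np.int16)
    mask = np.abs(delta) > threshold
    return np.argwhere(mask), delta[mask]

# Two synthetic 64x64 grayscale frames that differ only in a 4x4 region.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:14, 20:24] = 200  # a small moving object appears

events, values = delta_events(prev, curr)
sparsity = 1 - len(events) / prev.size  # fraction of pixels with no work to do
```

Here only 16 of 4,096 pixels generate events, so a processor that schedules work per event (rather than per frame) touches a tiny fraction of the data, which is the source of both the latency and energy savings the talk describes.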

Syllabus

Introduction
About GrAI Matter Labs
About the architecture
About latency
Response latency
Inference latency
Event-based self-timed scheduling
Single batch processing
Resource allocation
Synchronization
NeuronFlow Architecture
Event-based Execution
What is sparsity
Example
FPS vs Delta Frames
GrAI One chip
Summary
Sponsors

Taught by

tinyML

