Suitability of Forward-Forward and PEPITA Learning to MLCommons-Tiny Benchmarks
EDGE AI FOUNDATION via YouTube
Overview
Watch a technical talk by Danilo Pau, Technical Director at STMicroelectronics, exploring how the Forward-Forward and PEPITA learning algorithms compare to traditional backpropagation when applied to the MLCommons-Tiny benchmarks. Dive into the challenges of on-device learning on tiny devices with limited memory and computational resources. Learn about "forward-only algorithms" as biologically plausible alternatives that eliminate the need to store intermediate activations, potentially reducing the power consumed by memory operations. Examine a quantitative analysis of how these approaches affect computational complexity and memory usage across different neural network architectures. Discover how convolutional neural networks can achieve up to 40% memory reduction using Forward-Forward, though at some computational cost, while fully-connected networks show a different pattern of trade-offs. Understand the practical implications of implementing these techniques on microcontrollers and their suitability for various MLCommons-Tiny benchmark scenarios.
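As background for the talk, the following is a minimal NumPy sketch of the Forward-Forward idea mentioned above: each layer learns from a purely local "goodness" objective (high for positive data, low for negative data), so gradients never cross layer boundaries and no intermediate activations need to be retained for a backward pass. The FFLayer class, hyperparameters, and toy data are illustrative assumptions, not material from the talk or the MLCommons-Tiny benchmarks.

import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class FFLayer:
    """One fully-connected layer trained with a local Forward-Forward
    objective: push 'goodness' (mean squared activation) above a
    threshold for positive samples and below it for negative ones."""

    def __init__(self, n_in, n_out, lr=0.03, theta=2.0):
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out))
        self.b = np.zeros(n_out)
        self.lr, self.theta = lr, theta

    def _normalize(self, x):
        # Length-normalize so goodness from the previous layer cannot
        # leak forward; only the direction of the activity vector matters.
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)

    def forward(self, x):
        return relu(self._normalize(x) @ self.W + self.b)

    def train_step(self, x_pos, x_neg):
        # Purely local update: no activations are stored for a backward pass.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            xn = self._normalize(x)
            h = relu(xn @ self.W + self.b)
            g = (h ** 2).mean(axis=1)                  # goodness per sample
            p = 1.0 / (1.0 + np.exp(-sign * (g - self.theta)))
            # Gradient of -log(p) through the local objective only.
            grad_h = (-(1.0 - p) * sign)[:, None] * (2.0 * h / h.shape[1])
            grad_z = grad_h * (h > 0)                  # ReLU mask
            self.W -= self.lr * xn.T @ grad_z / len(x)
            self.b -= self.lr * grad_z.mean(axis=0)
        # Hand detached activations to the next layer.
        return self.forward(x_pos), self.forward(x_neg)

# Toy usage: positive samples cluster around +1, negatives around -1.
x_pos = rng.normal(+1.0, 0.5, (64, 16))
x_neg = rng.normal(-1.0, 0.5, (64, 16))
layers = [FFLayer(16, 32), FFLayer(32, 32)]
for _ in range(200):
    hp, hn = x_pos, x_neg
    for layer in layers:
        hp, hn = layer.train_step(hp, hn)

PEPITA is similar in spirit but replaces the backward pass differently: it runs a second forward pass whose input is perturbed by a random projection of the output error, again avoiding stored intermediate activations.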
Syllabus
tinyML Talks: Suitability of Forward-Forward and PEPITA Learning to MLCommons-Tiny benchmarks
Taught by
EDGE AI FOUNDATION