

Learning Paradigms for Neural Networks: The Locally Backpropagated Forward-Forward Algorithm

Inside Livermore Lab via YouTube

Overview

Explore a cutting-edge approach to neural network training in this 57-minute talk by Fabio Giampaolo from the University of Naples Federico II. Delve into the Locally Backpropagated Forward-Forward training strategy, a novel method that combines the effectiveness of backpropagation with the appealing attributes of the Forward-Forward algorithm. Understand how this technique addresses limitations of traditional training methods, particularly when integrating Deep Learning strategies into complex frameworks for physics-related problems. Learn about challenges such as incorporating non-differentiable components into neural architectures and implementing distributed learning across heterogeneous devices. Gain insight into how this approach can broaden the applicability of AI strategies in real-world settings, especially where conventional methods fall short.
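For context, the Forward-Forward idea (in Hinton's original formulation) trains each layer with its own local objective, a "goodness" score computed from that layer's activations, rather than propagating gradients end to end through the whole network. The minimal PyTorch sketch below illustrates only that generic layer-local training pattern; the goodness threshold, the normalization step, and the way positive and negative samples are produced are assumptions made for illustration, not the locally backpropagated variant presented in the talk.

```python
# Illustrative sketch of layer-local, Forward-Forward-style training in PyTorch.
# This is NOT the speaker's exact method: the threshold, optimizer, and the way
# positive/negative data are built are assumptions for demonstration only.
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    """One layer trained locally: gradients never flow to earlier layers."""
    def __init__(self, in_dim, out_dim, threshold=2.0, lr=1e-3):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = nn.ReLU()
        self.threshold = threshold  # assumed goodness threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize so only the direction of the input, not its magnitude,
        # is passed on (as in Hinton's Forward-Forward description).
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.act(self.linear(x))

    def local_update(self, x_pos, x_neg):
        # Goodness = mean squared activation; push it above the threshold
        # for positive samples and below it for negative samples.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,  # penalize low goodness on positive data
            g_neg - self.threshold,  # penalize high goodness on negative data
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()              # backpropagation stays inside this layer
        self.opt.step()
        # Detach outputs so the next layer's update cannot reach back here.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Toy usage: two locally trained layers on random "positive"/"negative" data.
layers = [FFLayer(784, 256), FFLayer(256, 256)]
x_pos, x_neg = torch.randn(32, 784), torch.randn(32, 784)
for layer in layers:
    x_pos, x_neg = layer.local_update(x_pos, x_neg)
```

The design point this sketch highlights is the one the talk builds on: because each layer (or block) owns its objective, training can in principle tolerate non-differentiable components between blocks and be distributed across heterogeneous devices, since no global backward pass is required.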

Syllabus

DDPS | Learning paradigms for neural networks: The locally backpropagated forward-forward algorithm

Taught by

Inside Livermore Lab
