Training a Neural Network - Backward Propagation and Gradient Descent
Valerio Velardo - The Sound of AI via YouTube
Overview
Explore the theory and mathematics behind training neural networks with backpropagation and gradient descent in this 22-minute video. Start with a high-level overview of the training process, understand the roles of the prediction and error "wizards," examine the gradient of the error function, and break down the elements of a neural network. Learn how gradient descent is used to optimize a neural network's weights. Accompanying slides are available as a visual aid, and The Sound of AI community offers a venue for further discussion. The presenter is also available for consulting and can be reached through various social media platforms.
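The training loop the video describes can be condensed into a short sketch. The code below is an illustrative assumption, not the presenter's implementation: a tiny network with one sigmoid hidden layer fit to toy data, showing the forward pass (prediction), the backpropagated gradient of the mean squared error, and a gradient descent weight update.

# Minimal sketch of backpropagation + gradient descent (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = x1 + x2 on a handful of samples.
X = rng.uniform(0, 0.5, size=(100, 2))
y = np.sum(X, axis=1, keepdims=True)

# Network parameters: 2 inputs -> 3 hidden units (sigmoid) -> 1 output (sigmoid).
W1 = rng.standard_normal((2, 3))
W2 = rng.standard_normal((3, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 0.5
for epoch in range(2000):
    # Forward pass: the "prediction" step.
    h = sigmoid(X @ W1)           # hidden activations
    y_hat = sigmoid(h @ W2)       # network output

    # Error function: mean squared error.
    error = y_hat - y
    mse = np.mean(error ** 2)

    # Backward pass: gradient of the error with respect to each weight matrix.
    delta_out = error * y_hat * (1 - y_hat)          # dE/dz at the output layer
    grad_W2 = h.T @ delta_out / len(X)
    delta_hidden = (delta_out @ W2.T) * h * (1 - h)  # dE/dz at the hidden layer
    grad_W1 = X.T @ delta_hidden / len(X)

    # Gradient descent: step against the gradient to reduce the error.
    W2 -= learning_rate * grad_W2
    W1 -= learning_rate * grad_W1

print(f"final MSE: {mse:.5f}")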
Syllabus
Introduction
High-level overview
Prediction wizard
Error wizard
Gradient of error function
Neural network elements
Gradient descent
Taught by
Valerio Velardo - The Sound of AI