
Loss Function and Gradient Descent in Neural Network Training - Part 3.1

Donato Capitella via YouTube

Overview

A 10-minute educational video exploring the fundamentals of preparing datasets for neural network training, with a focus on loss functions and gradient descent. Learn about essential data preparation steps, including labels for regression and classification (with one-hot encoding) and data normalization. Discover how to properly split datasets into training, validation, and testing segments, understand common loss functions (L1, L2, cross-entropy), and see how a network is trained by minimizing its loss. Explore the mechanics of gradient descent, backpropagation, and the gradient vector, complete with downloadable mind maps and references to PyTorch loss functions and more advanced neural network concepts.
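
As a rough illustration of the preparation steps described above, here is a minimal PyTorch sketch (PyTorch is assumed only because the video's references cover its loss functions; the tensor shapes, the three-class labels, and the 80/10/10 split are illustrative choices, not taken from the video):

```python
import torch
from torch.utils.data import TensorDataset, random_split

# Toy data: 100 samples with 4 features each, and integer class labels 0-2
X = torch.randn(100, 4)
y = torch.randint(0, 3, (100,))

# One-hot encode the classification labels (e.g. class 2 -> [0, 0, 1])
y_onehot = torch.nn.functional.one_hot(y, num_classes=3).float()

# Normalize each feature to zero mean and unit variance
X_norm = (X - X.mean(dim=0)) / X.std(dim=0)

# Split into training, validation, and testing segments (80/10/10)
dataset = TensorDataset(X_norm, y_onehot)
train_set, val_set, test_set = random_split(dataset, [80, 10, 10])
print(len(train_set), len(val_set), len(test_set))  # 80 10 10
```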

Syllabus

- Introduction
- Preparing Training Datasets
- Regression and Classification Labels: One-Hot Encoding
- Data Normalization
- Dataset Split: Training, Validation, Testing
- Loss Functions: L1, L2, Cross-Entropy (sketched in code after this list)
- Training a Network by Minimizing the Loss
- Gradient Descent
- Backpropagation and Gradient Vector
- Next Video...
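
To make the loss-function and gradient-descent items above concrete, here is a minimal PyTorch sketch of the three losses named in the syllabus plus a single hand-written gradient-descent step (the tensor values and the learning rate of 0.1 are arbitrary illustrations, not from the video):

```python
import torch

# Regression: compare predictions against targets with L1 and L2 losses
pred = torch.tensor([2.5, 0.0, 2.1], requires_grad=True)
target = torch.tensor([3.0, -0.5, 2.0])
l1 = torch.nn.L1Loss()(pred, target)   # mean absolute error
l2 = torch.nn.MSELoss()(pred, target)  # mean squared error (L2)

# Classification: cross-entropy takes raw logits and integer class labels
logits = torch.tensor([[1.2, 0.3, -0.8]], requires_grad=True)
label = torch.tensor([0])
ce = torch.nn.CrossEntropyLoss()(logits, label)

# One gradient-descent step on the L2 loss: backpropagation fills
# pred.grad with dL/dpred, then we move against the gradient
l2.backward()
lr = 0.1
with torch.no_grad():
    pred -= lr * pred.grad  # new = old - lr * gradient

print(l1.item(), l2.item(), ce.item())
```

This update rule is exactly what an optimizer such as torch.optim.SGD applies to every parameter of a network during training.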

Taught by

Donato Capitella

