Loss Function and Gradient Descent in Neural Network Training - Part 3.1

Donato Capitella via YouTube

3 of 10 - Regression and Classification Labels One-Hot Encoding


Classroom Contents


  1. 1 - Introduction
  2. 2 - Preparing Training Datasets
  3. 3 - Regression and Classification Labels One-Hot Encoding
  4. 4 - Data Normalization
  5. 5 - Dataset Split Training, Validation, Testing
  6. 6 - Loss Functions L1, L2, Cross-Entropy
  7. 7 - Training a Network by Minimizing the Loss
  8. 8 - Gradient Descent
  9. 9 - Backpropagation and Gradient Vector
  10. 10 - Next Video...
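
Item 3 above introduces one-hot encoding of classification labels. As a minimal sketch of the idea (my own illustration, not code from the video; the integer labels below are made up), each class index becomes a vector with a 1 in that class's position:

```python
import numpy as np

# Hypothetical 3-class problem: labels are integer class indices.
labels = np.array([0, 2, 1, 2])
num_classes = 3

# One-hot encoding: row i of the identity matrix is the vector for class i,
# so indexing the identity matrix with the labels encodes the whole batch.
one_hot = np.eye(num_classes)[labels]
print(one_hot)
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]
#  [0. 0. 1.]]
```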
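
Item 6 names the L1, L2, and cross-entropy losses. A rough sketch of how these are commonly computed (an illustration under my own assumptions, not the video's code; `probs` is taken to be the network's softmax output and the targets one-hot vectors):

```python
import numpy as np

def l1_loss(y_pred, y_true):
    # Mean absolute error, typically used for regression targets.
    return np.mean(np.abs(y_pred - y_true))

def l2_loss(y_pred, y_true):
    # Mean squared error, also common for regression targets.
    return np.mean((y_pred - y_true) ** 2)

def cross_entropy_loss(probs, one_hot_targets, eps=1e-12):
    # Cross-entropy between predicted class probabilities and one-hot labels,
    # typically used for classification; eps guards against log(0).
    return -np.mean(np.sum(one_hot_targets * np.log(probs + eps), axis=1))
```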
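
Items 7 and 8 describe training as minimizing the loss with gradient descent, i.e. repeatedly applying the update w ← w − η · ∂L/∂w. A minimal single-parameter sketch (the toy loss L(w) = (w − 3)² and the learning rate are assumptions chosen purely for illustration):

```python
# Minimize L(w) = (w - 3)**2; its gradient is dL/dw = 2 * (w - 3).
w = 0.0              # arbitrary starting value for the parameter
learning_rate = 0.1  # step size (the "eta" in the update rule)

for step in range(50):
    grad = 2 * (w - 3)             # gradient of the loss at the current w
    w = w - learning_rate * grad   # gradient descent update

print(w)  # approaches the minimum at w = 3
```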
