Loss Function and Gradient Descent in Neural Network Training - Part 3.1

Donato Capitella via YouTube

- Training a Network by Minimizing the Loss (7 of 10)


YouTube videos curated by Class Central.

Classroom Contents


  1. Introduction
  2. Preparing Training Datasets
  3. Regression and Classification Labels, One-Hot Encoding
  4. Data Normalization
  5. Dataset Split: Training, Validation, Testing
  6. Loss Functions: L1, L2, Cross-Entropy
  7. Training a Network by Minimizing the Loss
  8. Gradient Descent
  9. Backpropagation and the Gradient Vector
  10. Next Video...
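The core idea connecting items 6-9 above can be sketched in a few lines. This is not code from the video, just a minimal illustration under simple assumptions: a one-weight linear model fit to toy data by minimizing an L2 (squared-error) loss with gradient descent, where the gradient is computed analytically (the quantity backpropagation produces for deeper networks).

```python
# Toy data generated by y = 3x; the weight we hope to recover is 3.0.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]

def l2_loss(w):
    """Mean squared error of the prediction w*x."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def l2_grad(w):
    """Analytic gradient dLoss/dw (what backpropagation computes)."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w = 0.0      # starting weight
lr = 0.01    # learning rate (step size)
for _ in range(200):
    w -= lr * l2_grad(w)   # step opposite the gradient to reduce the loss

print(round(w, 3))  # converges toward the true weight 3.0
```

Each iteration moves `w` a small step in the direction that decreases the loss; with this data and learning rate the update contracts toward the minimizer, so after 200 steps `w` is essentially 3.0.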
