The Perceptron: Forward Propagation
Class Central Classrooms (beta)
YouTube videos curated by Class Central.
Classroom Contents
MIT: Introduction to Deep Learning
1. Intro
2. The Rise of Deep Learning
3. What is Deep Learning?
4. Lecture Schedule
5. Final Class Project
6. Class Support
7. Course Staff
8. Why Deep Learning
9. The Perceptron: Forward Propagation
10. Common Activation Functions
11. Importance of Activation Functions
12. The Perceptron: Example
13. The Perceptron: Simplified
14. Multi Output Perceptron
15. Single Layer Neural Network
16. Deep Neural Network
17. Quantifying Loss
18. Empirical Loss
19. Binary Cross Entropy Loss
20. Mean Squared Error Loss
21. Loss Optimization
22. Computing Gradients: Backpropagation
23. Training Neural Networks is Difficult
24. Setting the Learning Rate
25. Adaptive Learning Rates
26. Adaptive Learning Rate Algorithms
27. Stochastic Gradient Descent
28. Mini-batches while training
29. The Problem of Overfitting
30. Regularization 1: Dropout
31. Regularization 2: Early Stopping
32. Core Foundation Review
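Since the current video covers the perceptron's forward propagation (segments 9–13 above), a minimal sketch of that forward pass — a weighted sum of inputs plus a bias, passed through a sigmoid activation — might look like the following. The weights, bias, and input values here are illustrative placeholders, not taken from the lecture:

```python
import math

def perceptron_forward(x, w, b):
    """Forward pass of a single perceptron:
    weighted sum of inputs plus bias, then a sigmoid non-linearity."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # linear combination z = w.x + b
    return 1.0 / (1.0 + math.exp(-z))             # sigmoid squashes z into (0, 1)

# Hypothetical example: two inputs, arbitrary weights and bias
y = perceptron_forward(x=[1.0, 2.0], w=[0.5, -0.25], b=0.1)
```

The sigmoid keeps the output in (0, 1); the later segments on activation functions (items 10–11) cover alternatives such as tanh and ReLU.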