MIT: Introduction to Deep Learning
Alexander Amini and Massachusetts Institute of Technology via YouTube
Syllabus
Intro
The Rise of Deep Learning
What is Deep Learning?
Lecture Schedule
Final Class Project
Class Support
Course Staff
Why Deep Learning?
The Perceptron: Forward Propagation
Common Activation Functions
Importance of Activation Functions
The Perceptron: Example
The Perceptron: Simplified
Multi-Output Perceptron
Single Layer Neural Network
Deep Neural Network
Quantifying Loss
Empirical Loss
Binary Cross Entropy Loss
Mean Squared Error Loss
Loss Optimization
Computing Gradients: Backpropagation
Training Neural Networks is Difficult
Setting the Learning Rate
Adaptive Learning Rates
Adaptive Learning Rate Algorithms
Stochastic Gradient Descent
Mini-batches while training
The Problem of Overfitting
Regularization 1: Dropout
Regularization 2: Early Stopping
Core Foundation Review
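The perceptron topics above (forward propagation and activation functions) can be sketched in a few lines of NumPy. The input, weight, and bias values below are illustrative, not taken from the lecture slides.

```python
import numpy as np

# Common activation functions covered in the lecture
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

# Perceptron forward propagation: y = g(w . x + b),
# where g is a nonlinear activation function
def perceptron_forward(x, w, b, g=sigmoid):
    return g(np.dot(w, x) + b)

x = np.array([1.0, 2.0])    # example inputs (illustrative)
w = np.array([3.0, -2.0])   # example weights
b = 1.0                     # bias
y = perceptron_forward(x, w, b)  # sigmoid(3*1 - 2*2 + 1) = sigmoid(0) = 0.5
```

Without the nonlinearity `g`, stacking layers would collapse into a single linear transformation, which is why the activation function matters.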
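The training topics in the syllabus (mean squared error loss, the learning rate, and mini-batch stochastic gradient descent) fit together as in this minimal NumPy sketch; the synthetic data and hyperparameters here are illustrative assumptions, not from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (illustrative): y = 2x + 1 plus noise
X = rng.normal(size=(256,))
y = 2.0 * X + 1.0 + 0.1 * rng.normal(size=256)

w, b = 0.0, 0.0    # parameters to learn
lr = 0.1           # fixed learning rate (adaptive methods tune this per step)
batch_size = 32

for epoch in range(100):
    idx = rng.permutation(len(X))          # shuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch], y[batch]
        err = (w * xb + b) - yb
        # Gradients of the mean squared error loss L = mean(err^2)
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        # SGD update: step against the gradient, scaled by the learning rate
        w -= lr * grad_w
        b -= lr * grad_b
```

Each mini-batch gives a noisy but cheap estimate of the true gradient, which is the trade-off the mini-batch lectures discuss.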
Taught by
Alexander Amini (https://www.youtube.com/@AAmini/videos)