Introduction to Deep Learning - MIT 2018
Alexander Amini and Massachusetts Institute of Technology via YouTube
Overview
Syllabus
Intro
What is Deep Learning?
Deep Learning Success: Vision
Deep Learning Success: Audio
Administrative Information
Final Class Project
Class Support
Course Staff
Why Deep Learning?
The Perceptron: Forward Propagation
Common Activation Functions
Importance of Activation Functions
The Perceptron: Example
The Perceptron: Simplified
Multi Output Perceptron
Single Layer Neural Network
Deep Neural Network
Quantifying Loss
Empirical Loss
Binary Cross Entropy Loss
Mean Squared Error Loss
Loss Optimization
Computing Gradients: Backpropagation
Training Neural Networks is Difficult
Setting the Learning Rate
Adaptive Learning Rates
Adaptive Learning Rate Algorithms
Stochastic Gradient Descent
The Problem of Overfitting
Regularization 2: Early Stopping
Core Foundation Review
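
To make a few of the syllabus topics above concrete, the short sketches below illustrate them in plain Python with NumPy. They are minimal sketches, not code from the course; all function names, variable values, and hyperparameters are illustrative assumptions.

The first sketch covers "The Perceptron: Forward Propagation" and "Common Activation Functions": a weighted sum of the inputs plus a bias, passed through a nonlinear activation (sigmoid here, one of several common choices).

import numpy as np

def sigmoid(z):
    # Common activation function: squashes any real value into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def perceptron_forward(x, w, b):
    # Forward propagation: weighted sum of inputs plus a bias,
    # then a nonlinear activation.
    z = np.dot(w, x) + b
    return sigmoid(z)

x = np.array([1.0, 2.0, -1.0])   # example inputs (arbitrary values)
w = np.array([0.5, -0.3, 0.8])   # example weights (arbitrary values)
b = 0.1                          # example bias (arbitrary value)
print(perceptron_forward(x, w, b))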
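The next sketch shows the two losses named in the syllabus, "Binary Cross Entropy Loss" (for classification) and "Mean Squared Error Loss" (for regression), averaged over a batch as in the "Empirical Loss" topic. The small clipping epsilon is a numerical-stability assumption of this sketch, not something specified by the lecture.

import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Average negative log-likelihood for binary labels in {0, 1}.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def mean_squared_error(y_true, y_pred):
    # Average squared difference between predictions and targets.
    return np.mean((y_true - y_pred) ** 2)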
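Finally, a minimal illustration of gradient descent with a fixed learning rate, the optimization idea behind the "Loss Optimization", "Setting the Learning Rate", and "Stochastic Gradient Descent" topics. The quadratic toy objective, the learning rate, and the step count are assumptions chosen purely for demonstration.

def gradient_descent(grad_fn, w0, learning_rate=0.1, steps=100):
    # Repeatedly step in the direction opposite the gradient of the loss.
    w = w0
    for _ in range(steps):
        w = w - learning_rate * grad_fn(w)
    return w

# Toy objective: f(w) = (w - 3)^2 with gradient 2 * (w - 3); minimum at w = 3.
w_star = gradient_descent(lambda w: 2.0 * (w - 3.0), w0=0.0)
print(w_star)  # converges toward 3.0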
Taught by
Alexander Amini
https://www.youtube.com/@AAmini/videos