YouTube

MIT: Introduction to Deep Learning

Alexander Amini and Massachusetts Institute of Technology via YouTube

Overview

Explore the foundations of deep learning in this introductory lecture from MIT's 6.S191 course. Delve into the rise of deep learning, understand its significance, and learn about perceptrons, neural networks, and activation functions. Discover how to quantify loss, optimize it with gradient descent and backpropagation, and tackle the challenges of training neural networks. Gain insights into adaptive learning rates, stochastic gradient descent, and strategies to prevent overfitting, such as dropout and early stopping. Master the core concepts of deep learning to build a strong foundation for advanced applications in artificial intelligence.
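
To make the perceptron, activation-function, and loss ideas mentioned above more concrete, here is a minimal NumPy sketch of a single perceptron's forward pass with a sigmoid activation and binary cross-entropy loss. This is an illustrative example, not code from the lecture, and the inputs, weights, and bias are made-up values.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes any real number into (0, 1),
    # giving the perceptron a non-linear output.
    return 1.0 / (1.0 + np.exp(-z))

def perceptron_forward(x, w, b):
    # Forward propagation: weighted sum of inputs plus bias,
    # passed through the activation function.
    z = np.dot(w, x) + b
    return sigmoid(z)

def binary_cross_entropy(y_true, y_pred):
    # Quantifying loss: confident wrong predictions are penalized heavily.
    eps = 1e-12  # avoid log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Made-up example values
x = np.array([1.0, 2.0])    # inputs
w = np.array([0.5, -0.3])   # weights
b = 0.1                     # bias
y_hat = perceptron_forward(x, w, b)
print(y_hat, binary_cross_entropy(1.0, y_hat))
```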

Syllabus

Intro
The Rise of Deep Learning
What is Deep Learning?
Lecture Schedule
Final Class Project
Class Support
Course Staff
Why Deep Learning
The Perceptron: Forward Propagation
Common Activation Functions
Importance of Activation Functions
The Perceptron: Example
The Perceptron: Simplified
Multi Output Perceptron
Single Layer Neural Network
Deep Neural Network
Quantifying Loss
Empirical Loss
Binary Cross Entropy Loss
Mean Squared Error Loss
Loss Optimization
Computing Gradients: Backpropagation
Training Neural Networks is Difficult
Setting the Learning Rate
Adaptive Learning Rates
Adaptive Learning Rate Algorithms
Stochastic Gradient Descent
Mini-batches while training
The Problem of Overfitting
Regularization 1: Dropout
Regularization 2: Early Stopping
Core Foundation Review
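
As a rough illustration of the optimization topics listed above (mini-batch stochastic gradient descent and early stopping), here is a self-contained sketch on a toy linear-regression problem. It is not taken from the lecture; the data, learning rate, batch size, and patience are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 1 plus noise (invented for illustration).
X = rng.uniform(-1, 1, size=200)
y = 3 * X + 1 + 0.1 * rng.normal(size=200)
X_train, y_train = X[:160], y[:160]   # training split
X_val, y_val = X[160:], y[160:]       # held-out split used for early stopping

w, b = 0.0, 0.0
lr = 0.1              # fixed learning rate; the lecture also covers adaptive rates
batch_size = 32
best_val, patience, bad_epochs = np.inf, 5, 0

for epoch in range(100):
    # Shuffle and sweep through the training set in mini-batches.
    idx = rng.permutation(len(X_train))
    for start in range(0, len(X_train), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X_train[batch], y_train[batch]
        err = w * xb + b - yb
        # Gradient of mean squared error w.r.t. w and b on this mini-batch.
        w -= lr * 2 * np.mean(err * xb)
        b -= lr * 2 * np.mean(err)

    # Early stopping: halt once validation loss stops improving.
    val_loss = np.mean((w * X_val + b - y_val) ** 2)
    if val_loss < best_val - 1e-6:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break

print(f"stopped at epoch {epoch}: w={w:.2f}, b={b:.2f}, val_loss={val_loss:.4f}")
```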

Taught by

Alexander Amini (https://www.youtube.com/@AAmini/videos)
