

Introduction to Deep Learning - MIT 2018

Alexander Amini and Massachusetts Institute of Technology via YouTube

Overview

Explore the foundations of deep learning in this introductory lecture from MIT's 6.S191 course. Delve into key concepts including perceptrons, neural networks, activation functions, loss quantification, backpropagation, and optimization techniques. Learn about the challenges of training neural networks, adaptive learning rates, and strategies to prevent overfitting. Gain insights into deep learning's successes in vision and audio applications, and understand why this field has become so influential. Access additional lectures covering topics such as sequence modeling, computer vision, generative models, reinforcement learning, and industry perspectives from leading tech companies.
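As a quick illustration of the perceptron forward propagation covered in the lecture, here is a minimal sketch of a single perceptron: a weighted sum of inputs plus a bias, passed through a sigmoid activation. This is an assumed NumPy example for orientation only, not code from the course.

```python
import numpy as np

def sigmoid(z):
    # A common activation function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def perceptron_forward(x, w, b):
    # Forward propagation for a single perceptron:
    # weighted sum of the inputs plus a bias, then a nonlinear activation
    return sigmoid(np.dot(w, x) + b)

# Illustrative inputs, weights, and bias (arbitrary values)
x = np.array([1.0, 2.0])
w = np.array([0.5, -0.3])
b = 0.1
print(perceptron_forward(x, w, b))  # a single output in (0, 1)
```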

Syllabus

Intro
What is Deep Learning
Deep Learning Success: Vision
Deep Learning Success: Audio
Administrative Information
Final Class Project
Class Support
Course Staff
Why Deep Learning
The Perceptron: Forward Propagation
Common Activation Functions
Importance of Activation Functions
The Perceptron: Example
The Perceptron: Simplified
Multi Output Perceptron
Single Layer Neural Network
Deep Neural Network
Quantifying Loss
Empirical Loss
Binary Cross Entropy Loss
Mean Squared Error Loss
Loss Optimization
Computing Gradients: Backpropagation
Training Neural Networks is Difficult
Setting the Learning Rate
Adaptive Learning Rates
Adaptive Learning Rate Algorithms
Stochastic Gradient Descent
The Problem of Overfitting
Regularization 1: Dropout
Regularization 2: Early Stopping
Core Foundation Review
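
To tie together several of the topics listed above (quantifying loss with binary cross entropy, computing gradients, and gradient descent with a fixed learning rate), here is a small illustrative sketch of a training loop for a single-layer model on toy data. It is a hedged example under assumed toy inputs, not taken from the course materials.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Quantifies loss for binary classification
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Toy data (illustrative only): 4 examples, 2 features each, AND-like labels
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 0.0, 0.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)
b = 0.0
learning_rate = 0.5  # a fixed learning rate for simplicity

for step in range(1000):
    # Forward pass
    y_pred = sigmoid(X @ w + b)
    # Gradients of the binary cross-entropy loss w.r.t. w and b
    grad_z = (y_pred - y) / len(y)
    grad_w = X.T @ grad_z
    grad_b = grad_z.sum()
    # Gradient descent update
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print("final loss:", binary_cross_entropy(y, sigmoid(X @ w + b)))
```

This sketch uses full-batch gradient descent with a fixed learning rate; the lecture additionally covers mini-batch stochastic gradient descent and adaptive learning-rate algorithms.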

Taught by

Alexander Amini (https://www.youtube.com/@AAmini/videos)

