
Deep Learning Fundamentals - Full Stack Deep Learning

The Full Stack via YouTube

Overview

Dive into the fundamentals of deep learning in this 30-minute lecture from the Full Stack Deep Learning Spring 2021 course. Explore artificial neural networks, the universal approximation theorem, and three major types of learning problems. Understand the empirical risk minimization problem and common loss functions, grasp the idea behind gradient descent, and see how backpropagation and automatic differentiation work in practice. Examine core neural architectures and the rise of GPUs and CUDA cores in deep learning. For those needing a refresher, consult the recommended online book at neuralnetworksanddeeplearning.com before watching.
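As a rough, non-authoritative illustration of the empirical risk minimization and gradient descent ideas the lecture covers (not code from the lecture itself), here is a minimal Python sketch that fits a toy linear model to hypothetical data by stepping against the gradient of a mean-squared-error loss; the gradients are derived by hand here, whereas deep learning frameworks obtain them via backpropagation / automatic differentiation:

```python
import numpy as np

# Hypothetical toy data for a 1-D regression problem (roughly y = 2x + 1).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Model parameters: slope w and bias b.
w, b = 0.0, 0.0
lr = 0.05  # learning rate (step size)

for step in range(500):
    pred = w * x + b                 # forward pass
    residual = pred - y
    loss = np.mean(residual ** 2)    # empirical risk: mean squared error
    # Gradients of the loss with respect to w and b, derived by hand.
    grad_w = 2 * np.mean(residual * x)
    grad_b = 2 * np.mean(residual)
    # Gradient descent update: move parameters against the gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final loss={loss:.6f}")
```

Running this drives w and b toward roughly 2 and 1; the loop structure (forward pass, loss, gradients, parameter update) mirrors what the lecture describes at the scale of full neural networks.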

Syllabus

- Intro
- Neural Networks
- Universality
- Learning Problems
- Empirical Risk Minimization / Loss Functions
- Gradient Descent
- Backpropagation / Automatic Differentiation
- Architectural Considerations
- CUDA / Cores of Compute

Taught by

The Full Stack
