
Recurrent Neural Networks and Transformers

Alexander Amini and Massachusetts Institute of Technology via YouTube

Overview

Explore the fundamentals of recurrent neural networks and transformers in this comprehensive lecture from MIT's Introduction to Deep Learning course. Delve into sequence modeling, neurons with recurrence, and the intuition behind RNNs. Learn how to unfold RNNs, build them from scratch, and understand the design criteria for sequential modeling. Work through a word prediction example and backpropagation through time, including the vanishing and exploding gradient issues it raises. Discover long short-term memory (LSTM) networks and a range of RNN applications. Investigate attention mechanisms, the intuition behind them, and their relationship to search. Learn how attention is computed with neural networks, how it scales, and where it is applied. The 58-minute lecture, delivered by Ava Soleimany, closes with a summary of key concepts and offers a solid foundation in these advanced deep learning techniques.
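To make the "RNNs from scratch" topic concrete, here is a minimal NumPy sketch (not code from the lecture) of a vanilla RNN cell unrolled over a sequence; the class name, dimensions, and toy inputs are assumptions made purely for illustration.

```python
import numpy as np

class SimpleRNNCell:
    """Minimal vanilla RNN cell: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b)."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Small random weights; shapes follow the update equation above.
        self.W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
        self.W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        self.b = np.zeros(hidden_dim)

    def step(self, x_t, h_prev):
        # One recurrence step: combine the current input with the previous hidden state.
        return np.tanh(self.W_xh @ x_t + self.W_hh @ h_prev + self.b)

    def forward(self, xs):
        # "Unfolding" the RNN: the same cell (same weights) is applied at every time step.
        h = np.zeros(self.W_hh.shape[0])
        states = []
        for x_t in xs:
            h = self.step(x_t, h)
            states.append(h)
        return states

# Toy usage: a length-3 sequence of 2-dimensional inputs, with 4 hidden units.
cell = SimpleRNNCell(input_dim=2, hidden_dim=4)
sequence = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
hidden_states = cell.forward(sequence)  # list of 3 hidden-state vectors, each of length 4
```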

Syllabus

- Introduction
- Sequence modeling
- Neurons with recurrence
- Recurrent neural networks
- RNN intuition
- Unfolding RNNs
- RNNs from scratch
- Design criteria for sequential modeling
- Word prediction example
- Backpropagation through time
- Gradient issues
- Long short-term memory (LSTM)
- RNN applications
- Attention fundamentals
- Intuition of attention
- Attention and search relationship
- Learning attention with neural networks (see the sketch after this syllabus)
- Scaling attention and applications
- Summary
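As a hedged illustration of the attention topics above, and not code from the lecture, the following NumPy sketch computes scaled dot-product attention, the query/key/value operation at the heart of transformers; the function name and toy array shapes are assumptions made for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # attention-weighted sum of values

# Toy usage: 3 queries attending over 4 key/value pairs, all of dimension 8.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)         # result has shape (3, 8)
```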

Taught by

Alexander Amini (https://www.youtube.com/@AAmini/videos)
