Long Short-Term Memory with PyTorch + Lightning

StatQuest with Josh Starmer via YouTube

Overview

Learn how to implement and train Long Short-Term Memory (LSTM) networks using PyTorch and Lightning in this comprehensive 33-minute tutorial. Code an LSTM unit from scratch, then use PyTorch's nn.LSTM() function for comparison. Discover Lightning's powerful features, including adding training epochs without restarting and easily visualizing training results. Explore key concepts such as importing modules, creating LSTM classes, initializing tensors, performing LSTM calculations, configuring optimizers, and calculating loss. Gain hands-on experience training both custom-built and PyTorch-provided LSTM models, and learn to evaluate training progress using TensorBoard. Perfect for those looking to deepen their understanding of LSTM implementation and training techniques in PyTorch and Lightning.
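
To give a flavor of the "LSTM unit from scratch" part of the tutorial, here is a minimal sketch of the math one LSTM step performs: a forget gate, an input gate with a candidate value, and an output gate. The scalar weights below are hand-picked placeholders for illustration; in the video they are learned tensors created inside a Lightning module, and the `lstm_unit` name here simply mirrors the syllabus heading.

```python
import math

def sigmoid(x):
    # Logistic function used by the LSTM's three gates.
    return 1.0 / (1.0 + math.exp(-x))

def lstm_unit(x, h_prev, c_prev, w):
    """One step of scalar LSTM math (illustrative weights in dict w)."""
    # Forget gate: what fraction of the previous cell state to keep.
    f = sigmoid(w["wf_x"] * x + w["wf_h"] * h_prev + w["bf"])
    # Input gate and candidate value: what new information to store.
    i = sigmoid(w["wi_x"] * x + w["wi_h"] * h_prev + w["bi"])
    g = math.tanh(w["wg_x"] * x + w["wg_h"] * h_prev + w["bg"])
    # New cell state (long-term memory).
    c = f * c_prev + i * g
    # Output gate and new hidden state (short-term memory).
    o = sigmoid(w["wo_x"] * x + w["wo_h"] * h_prev + w["bo"])
    h = o * math.tanh(c)
    return h, c

# Unrolling the unit over a sequence is what the forward pass does.
weights = {k: 0.5 for k in
           ["wf_x", "wf_h", "bf", "wi_x", "wi_h", "bi",
            "wg_x", "wg_h", "bg", "wo_x", "wo_h", "bo"]}
h, c = 0.0, 0.0
for x in [0.0, 0.5, 1.0]:
    h, c = lstm_unit(x, h, c, weights)
```

The final hidden state `h` is the unrolled network's prediction for the sequence.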

Syllabus

Awesome song and introduction
Importing the modules
An outline of an LSTM class
init: Creating and initializing the tensors
lstm_unit: Doing the LSTM math
forward: Make a forward pass through an unrolled LSTM
configure_optimizers: Configure the...optimizers.
training_step: Calculate the loss and log progress
Using and training our homemade LSTM
Evaluating training with TensorBoard
Adding more epochs to training
Using and training PyTorch's nn.LSTM
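
The later syllabus steps (forward pass, optimizer configuration, loss calculation, and training with nn.LSTM) can be sketched roughly as follows. This is a hedged approximation, not the video's exact code: it uses a plain PyTorch training loop in place of Lightning's Trainer so the sketch is self-contained, and the toy sequences and sizes are illustrative.

```python
import torch
import torch.nn as nn

class TinyLSTM(nn.Module):
    """Sketch of the tutorial's nn.LSTM-based model.

    In the video this is a lightning.LightningModule whose
    configure_optimizers() returns the optimizer and whose
    training_step() computes the loss; those roles appear
    inline in the loop below instead.
    """
    def __init__(self):
        super().__init__()
        # One input feature and one hidden unit: a single LSTM cell.
        self.lstm = nn.LSTM(input_size=1, hidden_size=1)

    def forward(self, x):
        # Reshape (seq_len,) -> (seq_len, batch=1, features=1).
        lstm_out, _ = self.lstm(x.view(len(x), 1, 1))
        # The prediction is the hidden state after the last element.
        return lstm_out[-1].squeeze()

model = TinyLSTM()
# Stands in for configure_optimizers().
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

# Two toy sequences with different labels (illustrative data).
inputs = [torch.tensor([0.0, 0.5, 0.25, 1.0]),
          torch.tensor([1.0, 0.5, 0.25, 1.0])]
labels = [torch.tensor(0.0), torch.tensor(1.0)]

for epoch in range(50):  # adding more epochs just means a larger range here
    for x, y in zip(inputs, labels):
        optimizer.zero_grad()
        # Stands in for training_step(): squared-error loss.
        loss = (model(x) - y) ** 2
        loss.backward()
        optimizer.step()
```

With Lightning, the loop above collapses to `Trainer(max_epochs=...).fit(model, dataloader)`, and the values logged in training_step can be inspected in TensorBoard.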

Taught by

StatQuest with Josh Starmer

