Overview
Learn about the inner workings of Recurrent Neural Networks (RNNs) in this comprehensive 29-minute video tutorial. Explore simple RNN units, time series analysis, and the intricacies of Backpropagation Through Time (BPTT). Dive into topics such as univariate and multivariate time series, RNN architecture, unrolling recurrent layers, and RNN configurations including sequence-to-vector and sequence-to-sequence. Understand the role of memory cells in simple RNNs, the significance of the tanh activation function, and the mathematical foundations of BPTT. Gain insight into the limitations of simple RNNs and prepare for more advanced neural network designs.
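The core ideas the video covers — a memory cell updated with tanh, unrolling over time, and the sequence-to-sequence vs. sequence-to-vector distinction — can be sketched in a few lines of NumPy. This is an illustrative toy (hypothetical sizes and weights, not the instructor's code): at each step t the cell computes h_t = tanh(W_x · x_t + W_h · h_{t-1} + b).

```python
import numpy as np

def simple_rnn_forward(xs, W_x, W_h, b):
    """Unroll a simple RNN over a sequence; return all hidden states."""
    h = np.zeros(W_h.shape[0])                   # initial memory cell state
    states = []
    for x in xs:                                  # one step per time step
        h = np.tanh(W_x @ x + W_h @ h + b)        # tanh keeps activations in (-1, 1)
        states.append(h)
    return np.stack(states)                       # shape: (timesteps, units)

# Toy multivariate time series (univariate would be features = 1).
rng = np.random.default_rng(0)
timesteps, features, units = 5, 3, 4              # assumed toy dimensions
xs = rng.normal(size=(timesteps, features))
W_x = rng.normal(size=(units, features)) * 0.1
W_h = rng.normal(size=(units, units)) * 0.1
b = np.zeros(units)

states = simple_rnn_forward(xs, W_x, W_h, b)
seq_to_seq = states                               # sequence-to-sequence: every hidden state
seq_to_vec = states[-1]                           # sequence-to-vector: last hidden state only
```

The same cell serves both configurations: keeping every hidden state gives a sequence-to-sequence RNN, while keeping only the final state gives a sequence-to-vector RNN.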
Syllabus
Intro
Univariate time series
Multivariate time series
Intuition
RNN architecture
Unrolling a recurrent layer
Data shape
Sequence to vector RNN
Sequence to sequence RNN
Memory cell for simple RNN
Why do we use tanh?
Backpropagation through time (BPTT)
The math behind
Issues with simple RNNs
What's up next?
Taught by
Valerio Velardo - The Sound of AI