Parameter Sharing - Recurrent and Convolutional Nets

Alfredo Canziani via YouTube

Overview

Explore parameter sharing in recurrent and convolutional neural networks in this comprehensive 2-hour lecture by Yann LeCun. Delve into hypernetworks, shared weights, and gradient addition in parameter sharing. Examine recurrent nets, including unrolling in time, vanishing and exploding gradients, and RNN tricks. Investigate memory concepts, LSTM networks, and attention mechanisms for sequence-to-sequence mapping. Study convolutional nets, including motif detection, convolution definitions, backpropagation, and architecture. Learn about vintage ConvNets, brain image interpretation, and the Hubel & Wiesel model of the visual cortex. Gain insights into ConvNet invariance and equivariance, training time, iteration cycles, and historical remarks in deep learning.
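The "parameter sharing and gradient addition" idea mentioned above can be sketched in a few lines. This is a hypothetical toy computation (not from the lecture itself): when one weight is reused at two places in the computation graph, backpropagation accumulates one gradient term per use.

```python
# Toy sketch of parameter sharing: the same weight w is used twice,
# so dL/dw is the SUM of the gradients from each use.

def forward(w, x1, x2):
    a = w * x1          # first use of the shared weight
    b = w * x2          # second use of the shared weight
    return a + b

def grad_w(w, x1, x2):
    d_from_first = x1   # gradient of (w * x1) with respect to w
    d_from_second = x2  # gradient of (w * x2) with respect to w
    return d_from_first + d_from_second  # gradients add

print(grad_w(2.0, 3.0, 4.0))  # 3.0 + 4.0 = 7.0
```

The same rule is what makes recurrent nets (one weight matrix reused at every time step) and convolutional nets (one kernel reused at every spatial location) trainable by ordinary backpropagation.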

Syllabus

– Welcome to class
– Hypernetworks
– Shared weights
– Parameter sharing ⇒ adding the gradients
– Max and sum reductions
– Recurrent nets
– Unrolling in time
– Vanishing and exploding gradients
– Math on the whiteboard
– RNN tricks
– RNN for differential equations
– GRU
– What is a memory
– LSTM – Long Short-Term Memory net
– Multilayer LSTM
– Attention for sequence to sequence mapping
– Convolutional nets
– Detecting motifs in images
– Convolution definitions
– Backprop through convolutions
– Stride and skip: subsampling and convolution “à trous”
– Convolutional net architecture
– Multiple convolutions
– Vintage ConvNets
– How does the brain interpret images?
– Hubel & Wiesel's model of the visual cortex
– Invariance and equivariance of ConvNets
– In the next episode…
– Training time, iteration cycle, and historical remarks
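The "vanishing and exploding gradients" item in the syllabus has a one-line intuition that can be sketched numerically. This is a hypothetical scalar example (not code from the lecture): backpropagating through T unrolled time steps multiplies the gradient by the recurrent weight T times, so it scales like w**T.

```python
# Toy scalar RNN: gradient through T unrolled steps picks up one
# factor of the recurrent weight w per step, i.e. it behaves like w**T.

def gradient_through_time(w, T):
    g = 1.0
    for _ in range(T):
        g *= w  # one multiplication per unrolled time step
    return g

print(gradient_through_time(0.9, 50))  # ~0.005: gradient vanishes
print(gradient_through_time(1.1, 50))  # ~117:   gradient explodes
```

Gating mechanisms such as the GRU and LSTM covered in the lecture are designed precisely to mitigate this effect.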

Taught by

Alfredo Canziani
