Class Central Classrooms: YouTube videos curated by Class Central.
NYU Deep Learning

Classroom Contents
1. 01 – History and resources
2. 01L – Gradient descent and the backpropagation algorithm
3. 02 – Neural nets: rotation and squashing
4. 02L – Modules and architectures
5. 03 – Tools, classification with neural nets, PyTorch implementation
6. 03L – Parameter sharing: recurrent and convolutional nets
7. 04L – ConvNet in practice
8. 04.1 – Natural signals properties and the convolution
9. 04.2 – Recurrent neural networks, vanilla and gated (LSTM)
10. 05L – Joint embedding method and latent variable energy based models (LV-EBMs)
11. 05.1 – Latent Variable Energy Based Models (LV-EBMs), inference
12. 05.2 – But what are these EBMs used for?
13. 06L – Latent variable EBMs for structured prediction
14. 06 – Latent Variable Energy Based Models (LV-EBMs), training
15. 07L – PCA, AE, K-means, Gaussian mixture model, sparse coding, and intuitive VAE
16. 07 – Unsupervised learning: autoencoding the targets
17. 08L – Self-supervised learning and variational inference
18. 08 – From LV-EBM to target prop to (vanilla, denoising, contractive, variational) autoencoder
19. 09L – Differentiable associative memories, attention, and transformers
20. 09 – AE, DAE, and VAE with PyTorch; generative adversarial networks (GAN) and code
21. 10L – Self-supervised learning in computer vision
22. 10 – Self / cross, hard / soft attention and the Transformer
23. 11L – Speech recognition and Graph Transformer Networks
24. 11 – Graph Convolutional Networks (GCNs)
25. 12L – Low resource machine translation
26. 12 – Planning and control
27. 13L – Optimisation for Deep Learning
28. 13 – The Truck Backer-Upper
29. 14L – Lagrangian backpropagation, final project winners, and Q&A session
30. 14 – Prediction and Planning Under Uncertainty
31. AI2S Xmas Seminar – Dr. Alfredo Canziani (NYU) – Energy-Based Self-Supervised Learning