Neuromorphic Engineering Algorithms for Edge ML and Spiking Neural Networks
YouTube videos curated by Class Central.

Classroom Contents
- 1 Intro
- 2 Johns Hopkins University
- 3 Spiking for tinyML
- 4 Batch Normalization Through Time (BNTT) for Temporal Learning (see the first sketch after this list)
- 5 BNTT: Energy Efficiency & Robustness
- 6 Training SNNs for the edge with heterogeneous demands
- 7 Spike Activation Map (SAM) for interpretable SNNs
- 8 Spiking neurons are binary units with timed outputs
- 9 End-to-end training is key for artificial neural networks
- 10 Solution: Replace the true gradient with a surrogate gradient (see the second sketch after this list)
- 11 Surrogate gradients self-calibrate neuromorphic systems when they can access the analog substrate variables
- 12 Fluctuation-driven initialization and bio-inspired homeostatic plasticity ensure optimal initialization
- 13 Holomorphic Equilibrium Propagation Computes Exact Gradients Through Finite Size Oscillations
- 14 Technical Program Committee
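
Item 4's BNTT idea is to normalize a layer's pre-activation at every simulation timestep with that timestep's own batch-norm statistics and learnable scale, rather than sharing one set across time. The following is a minimal sketch of that idea, assuming PyTorch; it illustrates per-timestep normalization only, not the talk's exact implementation, and the feature count, timestep count, and batch size are invented for the example.

```python
import torch
import torch.nn as nn

class BNTT(nn.Module):
    """Per-timestep batch normalization: one BatchNorm1d, with its own
    running statistics and learnable scale, for each simulation timestep."""

    def __init__(self, num_features: int, num_timesteps: int):
        super().__init__()
        self.bns = nn.ModuleList(
            nn.BatchNorm1d(num_features) for _ in range(num_timesteps)
        )

    def forward(self, x: torch.Tensor, t: int) -> torch.Tensor:
        # x: (batch, num_features) pre-activation at timestep t
        return self.bns[t](x)

# Example: normalize a layer's input current over 8 simulation timesteps.
bntt = BNTT(num_features=128, num_timesteps=8)
x = torch.randn(32, 128)   # a batch of input currents at one timestep
y = bntt(x, t=3)           # uses statistics specific to timestep 3
```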
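Item 10's surrogate-gradient trick keeps the hard, non-differentiable spike in the forward pass but substitutes a smooth function for its derivative in the backward pass, so end-to-end training still works. Below is a minimal sketch in PyTorch, assuming a fast-sigmoid surrogate (the surrogate shape and its steepness constant are choices made for this example, not fixed by the talk), wrapped in a leaky integrate-and-fire update to show where it enters an SNN.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate derivative in the backward pass."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()  # spike when membrane potential crosses threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Fast-sigmoid surrogate gradient: 1 / (1 + k|v|)^2, with k = 10 chosen for the example
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2
        return grad_output * surrogate

spike = SurrogateSpike.apply

def lif_step(v, x, beta=0.9, threshold=1.0):
    """One leaky integrate-and-fire step: decay, integrate, spike, soft reset."""
    v = beta * v + x             # leaky integration of input current
    s = spike(v - threshold)     # binary output: hard forward, surrogate backward
    v = v - s * threshold        # soft reset after a spike
    return v, s

# Gradients now flow through the spike despite its zero/undefined true derivative.
v = torch.zeros(4)
x = torch.randn(4, requires_grad=True)
v, s = lif_step(v, x)
s.sum().backward()               # backward pass uses the surrogate gradient
```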