Overview
Intro
Spiking for tinyML
Batch Normalization Through Time (BNTT) for Temporal Learning
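A minimal sketch of the time-step-wise normalization idea, assuming a PyTorch-style SNN unrolled over discrete time steps; the class name and shapes are illustrative choices, not the authors' reference implementation:

```python
import torch
import torch.nn as nn

class BNTT(nn.Module):
    """Batch norm with separate statistics and scale per time step."""

    def __init__(self, num_features: int, num_steps: int):
        super().__init__()
        # One BatchNorm per time step, so statistics of early and late
        # time steps are normalized independently instead of being mixed.
        self.bn = nn.ModuleList(
            nn.BatchNorm2d(num_features) for _ in range(num_steps)
        )

    def forward(self, x: torch.Tensor, t: int) -> torch.Tensor:
        # x: activations at time step t, shape (batch, channels, H, W)
        return self.bn[t](x)
```

Inside the time loop this would be called as, e.g., `bntt(conv(x_t), t)` before the spiking nonlinearity, so each time step keeps its own normalization statistics.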
BNTT: Energy Efficiency & Robustness
Training SNNs for the edge with heterogeneous demands
Spike Activation Map (SAM) for interpretable SNNs
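The general idea can be sketched as an exponentially decaying trace over each neuron's past spikes, aggregated into a heatmap. A minimal sketch assuming that trace-based formulation; the function name and the decay rate `gamma` are illustrative, and the exact SAM definition may differ in detail:

```python
import math
import torch

def spike_activation_map(spikes: torch.Tensor, gamma: float = 0.5) -> torch.Tensor:
    """Saliency from spike timing: recent spikes weigh more than old ones.

    spikes: binary tensor of shape (T, C, H, W) from one SNN layer.
    Returns a (T, H, W) stack of heatmaps.
    """
    trace = torch.zeros_like(spikes[0])
    maps = []
    for t in range(spikes.shape[0]):
        # trace(t) = sum over past spike times t' of exp(-gamma * (t - t'))
        trace = trace * math.exp(-gamma) + spikes[t]
        maps.append(trace.sum(dim=0))  # aggregate channels into one heatmap
    return torch.stack(maps)
```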
Spiking neurons are binary units with timed outputs
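A minimal discrete-time leaky integrate-and-fire (LIF) sketch of this point; the decay and threshold values are illustrative:

```python
import torch

def lif_step(x, v, decay=0.9, threshold=1.0):
    """One LIF update: leak, integrate the input, emit binary spikes."""
    v = decay * v + x                  # leaky integration of input current
    spikes = (v >= threshold).float()  # output is binary: spike or silence
    v = v - spikes * threshold         # soft reset where a spike occurred
    return spikes, v

# Each unit's output is a 0/1 spike train; information lives in *when*
# the ones occur, not in a graded activation value.
T, batch, n = 50, 4, 16
v = torch.zeros(batch, n)
out = []
for t in range(T):
    s, v = lif_step(0.3 * torch.rand(batch, n), v)
    out.append(s)
spike_train = torch.stack(out)  # shape (T, batch, n)
```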
End-to-end training is key for artificial neural networks
Solution: Replace the true gradient with a surrogate gradient
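A minimal sketch of the trick, assuming a PyTorch setting and a fast-sigmoid surrogate derivative; the class name and the slope `beta` are illustrative choices:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid derivative backward."""

    beta = 10.0  # surrogate slope; illustrative value

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()  # exact binary spike, no relaxation here

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # The true derivative of the step function is zero almost everywhere,
        # so no gradient would flow; substitute a smooth, nonzero surrogate.
        surrogate = 1.0 / (SurrogateSpike.beta * v.abs() + 1.0) ** 2
        return grad_output * surrogate

spike = SurrogateSpike.apply  # drop-in spiking nonlinearity for autograd
```

The forward pass stays exactly binary; only the backward pass is relaxed, which is what lets gradients flow end to end through the spiking network.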
Surrogate gradients self-calibrate neuromorphic systems when training can access the analog substrate's state variables
Fluctuation-driven initialization ensures an optimal starting point, and bio-inspired homeostatic plasticity maintains it during training
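A simplified sketch of the initialization side, assuming zero-mean weights and independent Poisson inputs: by Campbell's theorem the membrane-potential variance scales as n_inputs * rate * eps_hat * E[w^2], where eps_hat is the energy of the postsynaptic-potential kernel, so the weight scale can be solved for a target fluctuation level. The helper name and the numbers are illustrative, and the homeostatic-plasticity part that maintains this regime during training is not shown:

```python
import math
import torch

def fluctuation_driven_std(n_inputs, rate_hz, eps_hat, target_sigma_u=1.0):
    """Weight std that puts membrane-potential fluctuations at a target scale.

    For zero-mean weights and Poisson inputs, Campbell's theorem gives
    Var[U] ~ n_inputs * rate_hz * eps_hat * E[w^2], with
    eps_hat = integral of eps(t)^2 dt the PSP kernel energy; solve for
    the weight std that yields std(U) = target_sigma_u.
    """
    return target_sigma_u / math.sqrt(n_inputs * rate_hz * eps_hat)

# Example: 100 presynaptic inputs firing at 10 Hz, kernel energy 5e-3 s.
std = fluctuation_driven_std(n_inputs=100, rate_hz=10.0, eps_hat=5e-3)
w = torch.randn(200, 100) * std  # weights for a 100 -> 200 layer
```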
Holomorphic Equilibrium Propagation Computes Exact Gradients Through Finite Size Oscillations
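As a sketch of the key identity (notation simplified and up to sign conventions; F denotes the equilibrium-propagation primitive function, beta the nudging strength extended to the complex plane, and s_beta the corresponding equilibrium state): the standard EP gradient is the derivative of the partial derivative of F at beta = 0, which Cauchy's integral formula turns into a contour integral that can be estimated from N points on a circle of finite radius r, i.e. from finite-size oscillations of beta.

```latex
% Standard EP: dL/dtheta = d/dbeta [ \partial_\theta F(\theta,\beta,s_\beta) ] at beta = 0.
% If this map is holomorphic in beta, Cauchy's formula for the first derivative
% turns it into a contour integral over a circle of finite radius r:
\frac{dL}{d\theta}
  = \frac{1}{2\pi i}\oint_{|\beta| = r}
      \frac{\partial_\theta F(\theta,\beta,s_\beta)}{\beta^{2}}\,d\beta
  \approx \frac{1}{N r}\sum_{k=0}^{N-1}
      e^{-i\varphi_k}\,\partial_\theta F(\theta,\beta_k,s_{\beta_k}),
\qquad \beta_k = r\,e^{i\varphi_k},\quad \varphi_k = \frac{2\pi k}{N}.
```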