Overview
Explore the potential of spintronics in hardware neural networks for fast and energy-efficient machine learning through this 59-minute APS Physics journal club presentation. Delve into the challenges of device-to-device variations in large-scale neuromorphic systems and discover how in situ learning of weights and biases in a Boltzmann machine can address these issues. Learn about a scalable, autonomously operating learning circuit using spintronics-based neurons, designed for standalone AI devices capable of efficient edge learning. Join Jan Kaiser from Purdue University as he discusses his team's recent study published in Physical Review Applied, demonstrating the ability to counter variability and learn probability distributions for meaningful operations like a full adder. Gain insights into physics-inspired computing, probabilistic bits (p-bits), magnetic tunnel junctions, and the potential for energy savings in this cutting-edge field. The presentation is followed by a Q&A session moderated by Dr. Matthew Daniels from NIST Gaithersburg, covering topics such as scaling, mixed-signal circuits, analog noise, improved magnetic tunnel junctions, memory considerations, and algorithm mapping.
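For a concrete picture of the ideas above, here is a minimal software sketch, not the hardware circuit from the talk: it simulates p-bit dynamics (each unit updates as m_i = sgn(tanh(I_i) - r) with r drawn uniformly) and trains a small, fully visible Boltzmann machine with the classical correlation-matching learning rule. The three-unit network, the two-pattern target distribution, and all hyperparameters are hypothetical stand-ins for the full-adder distribution discussed in the presentation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3  # three p-bits (fully visible units)

# Hypothetical 3-bit target distribution: equal mass on two patterns.
data = np.array([[ 1., -1.,  1.],
                 [-1.,  1., -1.]])

def sample_model(W, b, sweeps=1500, burn_in=300):
    """Draw samples by asynchronous p-bit updates: m_i = sgn(tanh(I_i) - r)."""
    m = rng.choice([-1.0, 1.0], size=n)
    out = []
    for t in range(sweeps):
        for i in rng.permutation(n):
            I = W[i] @ m + b[i]                      # net input to p-bit i
            m[i] = np.sign(np.tanh(I) - rng.uniform(-1.0, 1.0))
        if t >= burn_in:
            out.append(m.copy())
    return np.array(out)

W = np.zeros((n, n))   # symmetric couplings, zero diagonal
b = np.zeros(n)        # biases
lr = 0.05

for epoch in range(150):
    model = sample_model(W, b)
    # Boltzmann learning rule: nudge model correlations toward data correlations.
    dW = data.T @ data / len(data) - model.T @ model / len(model)
    db = data.mean(axis=0) - model.mean(axis=0)
    np.fill_diagonal(dW, 0.0)      # no self-coupling
    W += lr * dW
    W = (W + W.T) / 2.0            # keep the coupling matrix symmetric
    b += lr * db

# The trained network should now visit the two target patterns most often.
samples = sample_model(W, b)
patterns, counts = np.unique(samples, axis=0, return_counts=True)
for p, c in sorted(zip(patterns.tolist(), counts.tolist()), key=lambda x: -x[1]):
    print(p, round(c / len(samples), 3))
```

In the spintronics implementation described in the talk, a low-barrier magnetic tunnel junction supplies the tanh response and the randomness natively, rather than a pseudorandom generator, which is where the speed and energy advantages come from.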
Syllabus
Introduction
Welcome
Physics-Inspired Computing
P-bits
Magnetic Tunnel Junction
Probabilistic Computer
Variability Concerns
Machine Learning
Summary
Q&A
Tanner
Thomas
Scaling
Removing the Digital Intermediary
Mixed-Signal Circuit
Analog Noise
Better MTJs
Memory
Algorithms
Kullback-Leibler Divergence
Mapping Algorithms
More Questions
Energy Savings
Taught by
APS Physics