TinyML Talks - SRAM Based In-Memory Computing for Energy-Efficient AI Inference

tinyML via YouTube

Classroom Contents

  1. Intro
  2. ML collaboration with
  3. Success of Deep Learning / AI
  4. AI Algorithm & Edge Hardware
  5. Typical DNN Accelerators
  6. Eyeriss (JSSC 2017)
  7. MCM Accelerator (JSSC 2020)
  8. Bottleneck of All-Digital DNN HW Energy/Power
  9. In-Memory Computing for DNNs
  10. Analog IMC for SRAM Column
  11. Analog SRAM IMC - Resistive
  12. Analog SRAM IMC - Capacitive
  13. ADC Optimization for IMC
  14. Proposed IMC SRAM Macro Prototypes
  15. Going Beyond IMC Macro Design
  16. PIMCA: Programmable IMC Accelerator
  17. IMC Modeling Framework
  18. IMC HW Noise-Aware Training & Inference
  19. Black-box Adversarial Input Attack
  20. Pruning of Crossbar-based IMC Hardware
  21. Acknowledgements
  22. Contact Information
