Overview
Explore a detailed explanation of a research paper on bio-inspired bistable recurrent cells for long-lasting memory in neural networks. Dive into the limitations of LSTMs and GRUs in maintaining long-term memory, and discover how the concept of neuronal bistability from biology is applied to create a new recurrent cell architecture. Learn about the structure and advantages of this bistable recurrent cell, including its ability to store information at the cellular level for extended periods. Examine the implementation of neuromodulation in this new architecture and its connection to standard GRU cells. Analyze the performance of this novel approach through two benchmarks: the Copy First task and the Denoising task. Gain insights into the potential implications of this research for improving long-term memory in recurrent neural networks and its relevance to biological plausibility in artificial neural networks.
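To make the architecture described above concrete, here is a minimal NumPy sketch of a single bistable recurrent cell (BRC) update step, following the equations presented in the paper. The function and variable names (`brc_step`, `Ua`, `Uc`, `Uh`, `wa`, `wc`) are illustrative choices, not the paper's reference implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def brc_step(x, h_prev, Ua, Uc, Uh, wa, wc):
    """One forward step of a bistable recurrent cell (BRC).

    Unlike a GRU, the recurrent connections here are diagonal
    (per-neuron scalars wa, wc), so each neuron stores information
    independently at the cellular level.
    """
    # a > 1 pushes a neuron into its bistable regime, letting it latch a value
    a = 1.0 + np.tanh(Ua @ x + wa * h_prev)
    # c gates how much of the previous state is kept (like a GRU update gate)
    c = sigmoid(Uc @ x + wc * h_prev)
    # Candidate update mixed with the retained previous state
    h = c * h_prev + (1.0 - c) * np.tanh(Uh @ x + a * h_prev)
    return h

# Toy dimensions: 3 inputs, 4 hidden neurons
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
Ua, Uc, Uh = (0.1 * rng.normal(size=(n_h, n_in)) for _ in range(3))
wa, wc = rng.normal(size=n_h), rng.normal(size=n_h)

h = np.zeros(n_h)
for _ in range(5):
    h = brc_step(rng.normal(size=n_in), h, Ua, Uc, Uh, wa, wc)
print(h.shape)  # (4,)
```

The neuromodulated variant discussed later in the video replaces the per-neuron scalars `wa` and `wc` with full recurrent weight matrices, which recovers the connectivity pattern of a standard GRU while keeping the bistable dynamics.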
Syllabus
- Intro & Overview
- Recurrent Neural Networks
- Gated Recurrent Unit
- Neuronal Bistability
- Bistable Recurrent Cell
- Neuromodulation
- Copy First Benchmark
- Denoising Benchmark
- Conclusion & Comments
Taught by
Yannic Kilcher