Journal
IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS
Volume 10, Issue 4, Pages 522-535
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JETCAS.2020.3040248
Keywords
Neurons; Memristors; Neuromorphics; Training; Neural networks; Synapses; On-chip learning; surrogate gradient (SG) learning; Spiking Neural Networks (SNNs); memristive devices
Funding
- National Science Foundation [1652159, 1823366]
  - Directorate for Computer & Information Science & Engineering [1652159, 1823366]
  - Division of Computer and Network Systems
  - Division of Computing and Communication Foundations [1652159]
Recent breakthroughs in neuromorphic computing show that local forms of gradient-descent learning are compatible with Spiking Neural Networks (SNNs) and synaptic plasticity. Although SNNs can be scalably implemented using neuromorphic VLSI, an architecture that can learn in situ using gradient descent is still missing. In this paper, we propose a local, gradient-based, error-triggered learning algorithm with online ternary weight updates. The proposed algorithm enables online training of multi-layer SNNs with memristive neuromorphic hardware, with only a small loss in performance compared with the state of the art. We also propose a hardware architecture based on memristive crossbar arrays to perform the required vector-matrix multiplications. The necessary peripheral circuitry, including the presynaptic, postsynaptic, and write circuits required for online training, has been designed in the subthreshold regime for power saving, in a standard 180 nm CMOS process.
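The abstract's core idea of error-triggered learning with ternary weight updates can be illustrated with a minimal sketch. This is not the authors' exact algorithm: the function name `ternary_update`, the threshold parameter `theta`, and the use of a pre-synaptic activity trace as the local gradient factor are all illustrative assumptions in the spirit of surrogate-gradient learning.

```python
import numpy as np

def ternary_update(weights, pre_activity, error, theta=0.5):
    # Illustrative sketch, not the paper's exact rule.
    # Local gradient estimate: outer product of the post-synaptic error
    # and the pre-synaptic activity trace (surrogate-gradient style).
    grad = np.outer(error, pre_activity)
    # Error-triggered gating: a row of synapses is written only when its
    # neuron's error magnitude exceeds theta, which limits memristor writes.
    gate = (np.abs(error) > theta)[:, None]
    # Ternary write: each triggered synapse steps by exactly one unit
    # (-1, 0, or +1) in the direction that reduces the error.
    return weights - np.sign(grad) * gate

w = np.zeros((3, 4))
pre = np.array([0.2, 0.7, 0.1, 0.5])   # pre-synaptic activity trace
err = np.array([0.9, 0.05, -0.8])      # only neurons 0 and 2 cross theta
w_new = ternary_update(w, pre, err)
```

Because each update is restricted to {-1, 0, +1} and is applied only when the error is large enough, the scheme maps naturally onto single-pulse conductance increments in a memristive crossbar, where the crossbar itself performs the vector-matrix multiplications.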