4.6 Article

Efficient training of supervised spiking neural networks via the normalized perceptron based learning rule

Journal

NEUROCOMPUTING
Volume 241, Issue -, Pages 152-163

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2017.01.086

Keywords

Spiking neural networks; Temporal encoding mechanism; Supervised learning; Perceptron based learning rule

Abstract

Spiking neural networks (SNNs) are the third generation of artificial neural networks and have achieved notable success in the field of pattern recognition. However, most existing supervised training methods for SNNs are not efficient enough to meet real-time requirements. To address this issue, the normalized perceptron based learning rule (NPBLR) is proposed in this paper for the supervised training of multi-layer SNNs. Unlike traditional methods, our algorithm trains only the selected misclassified time points and the target ones, employing a perceptron-based neuron. Furthermore, the weight modification in our algorithm is normalized by a voltage-based function, which is more efficient than the traditional time-based method because the firing time is calculated from the voltage value. In contrast to traditional multi-layer algorithms, which ignore the temporal accumulation of spikes, our algorithm defines the spiking activity of the postsynaptic neuron as the rate-accumulation function of all presynaptic neurons within a specific time-frame. Through these strategies, our algorithm overcomes several difficulties in the training of SNNs, such as inefficiency and the no-fire problem. Comprehensive simulations are conducted in both single- and multi-layer networks to investigate the learning performance of our algorithm; the results demonstrate that it achieves higher learning efficiency and stronger parameter robustness than traditional algorithms. (C) 2017 Elsevier B.V. All rights reserved.
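To make the training strategy described in the abstract more concrete, the sketch below shows one way a perceptron-style spiking-neuron update with voltage-based normalization could be organized. It is a minimal illustration under stated assumptions, not the paper's exact NPBLR formulation: the alpha-shaped PSP kernel and the names psp_kernel, membrane_voltage, npblr_step, threshold, and lr are all hypothetical choices made for the example.

```python
import numpy as np

def psp_kernel(t, tau=10.0):
    # Alpha-shaped post-synaptic potential kernel; the kernel shape and time
    # constant are assumptions for this sketch, not taken from the paper.
    return (t / tau) * np.exp(1.0 - t / tau) if t > 0 else 0.0

def membrane_voltage(weights, pre_spike_times, t):
    # Voltage at time t as the weighted accumulation of all presynaptic PSPs
    # up to t (the "rate accumulation" view of postsynaptic activity).
    psps = np.array([sum(psp_kernel(t - s) for s in spikes if s <= t)
                     for spikes in pre_spike_times])
    return float(weights @ psps), psps

def npblr_step(weights, pre_spike_times, target_times, actual_times,
               threshold=1.0, lr=0.01):
    # One pass of a perceptron-style update that touches only the target
    # firing times and the spurious (misclassified) firing times; the step
    # is normalized by the voltage-related PSP vector rather than by
    # spike-time differences. Parameter values are illustrative.
    w = weights.copy()
    for t_d in target_times:                      # potentiate missed targets
        v, psps = membrane_voltage(w, pre_spike_times, t_d)
        if v < threshold:
            norm = float(psps @ psps) + 1e-12     # voltage-based normalization
            w += lr * (threshold - v) * psps / norm
    for t_a in actual_times:                      # depress spurious firings
        if t_a not in target_times:
            v, psps = membrane_voltage(w, pre_spike_times, t_a)
            norm = float(psps @ psps) + 1e-12
            w -= lr * (v - threshold) * psps / norm
    return w

# Example usage with made-up spike trains (times in ms):
w0 = np.random.rand(3) * 0.5
spikes = [[2.0, 15.0], [5.0], [1.0, 9.0, 20.0]]
w1 = npblr_step(w0, spikes, target_times=[12.0], actual_times=[4.0])
```

Normalizing by the accumulated PSP vector rather than by spike-time differences keeps the step size bounded even when presynaptic activity is dense, which is in the spirit of the efficiency argument the abstract makes for the voltage-based function.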

