Journal: NEUROCOMPUTING
Volume 90, Pages 96-105
Publisher: ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2012.01.032
Keywords: Reservoir computing; Output feedback; Regularization; Stability
Abstract: Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning affects the output feedback loop and can lead to error amplification. Regularization is therefore important both for generalization and for reducing error amplification. We show that regularization of both the reservoir and the read-out layer reduces the risk of error amplification, mitigates parameter dependency, and boosts the task-specific performance of reservoir networks with output feedback. We discuss the deeper connection between regularization of the learning process and the stability of the trained network. (C) 2012 Elsevier B.V. All rights reserved.
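To make the role of read-out regularization concrete, below is a minimal sketch of an echo state network with output feedback, trained by teacher forcing with a ridge-regularized (Tikhonov) read-out. All names, network sizes, the spectral-radius scaling, and the regularization strength alpha are illustrative assumptions for this sketch, not the authors' exact setup from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative size: a 200-unit reservoir generating a scalar signal.
n_res = 200

# Random reservoir weights, rescaled so the spectral radius stays below 1.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Output-feedback weights: the (fed-back) output drives the reservoir.
W_fb = rng.uniform(-1.0, 1.0, n_res)

def collect_states(teacher):
    """Run the reservoir with the teacher signal clamped into the
    feedback loop (teacher forcing) and record the reservoir states."""
    x = np.zeros(n_res)
    states = []
    for y in teacher:
        x = np.tanh(W @ x + W_fb * y)
        states.append(x.copy())
    return np.array(states)

# Toy target pattern the network should later generate on its own.
T = 1500
teacher = np.sin(0.1 * np.arange(T))

states = collect_states(teacher)
washout = 100                      # discard the initial transient
X, Y = states[washout:], teacher[washout:]

# Ridge-regularized read-out: solve (X^T X + alpha I) w = X^T Y.
# Larger alpha shrinks the read-out weights, which damps how strongly
# read-out errors are re-injected through the feedback loop.
alpha = 1e-4
w_out = np.linalg.solve(X.T @ X + alpha * np.eye(n_res), X.T @ Y)

# Autonomous run: the network's own output is now fed back.
x, y = X[-1].copy(), Y[-1]
generated = []
for _ in range(300):
    x = np.tanh(W @ x + W_fb * y)
    y = x @ w_out
    generated.append(y)

print("first generated values:", np.round(generated[:5], 3))
```

The sketch illustrates the mechanism the abstract describes: during autonomous generation every read-out error is fed back into the reservoir, so an under-regularized read-out (alpha too small) can amplify small deviations until the generated pattern drifts or diverges, while stronger regularization trades a little training fit for a more stable feedback loop.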