4.2 Article

Superior Performance of Using Hyperbolic Sine Activation Functions in ZNN Illustrated via Time-Varying Matrix Square Roots Finding

Journal

COMPUTER SCIENCE AND INFORMATION SYSTEMS
Volume 9, Issue 4, Pages 1603-1625

Publisher

COMSIS CONSORTIUM
DOI: 10.2298/CSIS120121043Z

Keywords

Zhang neural network; global exponential convergence; hyperbolic sine activation functions; time-varying matrix square roots; implementation errors

Abstract

A special class of recurrent neural network, termed the Zhang neural network (ZNN) and described by an implicit dynamic equation, has recently been proposed for the online solution of time-varying matrix square roots. Such a ZNN model is constructed using monotonically increasing odd activation functions, so that the theoretical time-varying matrix square root is obtained in an error-free manner. Different choices of the activation-function array lead to different performance of the ZNN model; generally speaking, the ZNN model using hyperbolic sine activation functions achieves better performance than those using other activation functions. In this paper, to pursue superior convergence and robustness properties, hyperbolic sine activation functions are applied to the ZNN model for the online solution of time-varying matrix square roots. Theoretical analysis and computer-simulation results further demonstrate the superior performance of the ZNN model with hyperbolic sine activation functions in the presence of large model-implementation errors, in comparison with the model using linear activation functions.
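Although only the abstract is reproduced above, the ZNN construction it refers to is typically stated as follows: define the matrix-valued error E(t) = X(t)X(t) - A(t) and impose the design formula Edot(t) = -gamma * Phi(E(t)), where gamma > 0 is a design gain and Phi(.) is an array of monotonically increasing odd activation functions applied element-wise (here, the hyperbolic sine). Expanding Edot(t) gives the implicit dynamics Xdot(t)X(t) + X(t)Xdot(t) = Adot(t) - gamma * Phi(X(t)X(t) - A(t)), which can be solved for Xdot(t) at each instant and integrated. The Python sketch below illustrates this under stated assumptions: the test matrix A(t), the gain gamma, the sinh scaling xi, the step size, and the initial state are illustrative choices, not values taken from the paper, and a crude fixed-step Euler loop stands in for a proper ODE solver.

```python
import numpy as np

# Illustrative time-varying matrix A(t) = B(t) @ B(t), so B(t) is one exact square root.
# B(t), gamma, xi, dt and the initial state are hypothetical choices for this sketch.
def B(t):
    return np.array([[3.0 + np.sin(t), np.cos(t)],
                     [np.cos(t),       3.0 - np.sin(t)]])

def Bdot(t):
    return np.array([[np.cos(t),  -np.sin(t)],
                     [-np.sin(t), -np.cos(t)]])

def A(t):
    return B(t) @ B(t)

def Adot(t):
    return Bdot(t) @ B(t) + B(t) @ Bdot(t)

def phi_sinh(E, xi=2.0):
    """Element-wise hyperbolic sine activation array Phi(E)."""
    return np.sinh(xi * E)

def znn_step(X, t, dt, gamma=10.0, phi=phi_sinh):
    """One Euler step of the implicit ZNN dynamics
       Xdot @ X + X @ Xdot = Adot(t) - gamma * Phi(X @ X - A(t)),
       obtained from the design formula Edot = -gamma * Phi(E), E = X @ X - A."""
    n = X.shape[0]
    E = X @ X - A(t)
    rhs = Adot(t) - gamma * phi(E)
    # Solve the Sylvester-type equation for Xdot via Kronecker vectorization
    # (column-major vec): (I kron X + X^T kron I) vec(Xdot) = vec(rhs).
    K = np.kron(np.eye(n), X) + np.kron(X.T, np.eye(n))
    xdot = np.linalg.solve(K, rhs.flatten(order="F")).reshape((n, n), order="F")
    return X + dt * xdot

if __name__ == "__main__":
    dt, T = 1e-3, 5.0
    # The steep sinh activation amplifies large residuals, so a fixed-step Euler loop
    # needs either an initial state near a true square root or an adaptive solver;
    # here we start close to B(0).
    X = B(0.0) + 0.1 * np.eye(2)
    for t in np.arange(0.0, T, dt):
        X = znn_step(X, t, dt)
    # The residual ||X(t) @ X(t) - A(t)||_F should have decayed towards zero.
    print("final residual:", np.linalg.norm(X @ X - A(T)))
```

Replacing phi_sinh with an identity map (a linear activation array) yields the comparison model discussed in the abstract; because sinh grows much faster than a linear function for large arguments, large residuals and implementation errors are suppressed more aggressively, which is the intuition behind the superior convergence and robustness reported in the paper.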
