
Twisted Quaternary Neural Networks

Publisher

WILEY
DOI: 10.1002/tee.21746

Keywords

neural networks; quaternion; learning; backpropagation algorithm


The quaternary neural network (QNN) proposed by Nitta is a high-dimensional neural network. Through computer simulations, Nitta showed that the QNN learns faster than ordinary neural networks and requires only about one-third as many parameters as a real-valued neural network. In this paper, we propose the twisted quaternary neural network (TQNN), which modifies the direction of the multiplications in the QNN. Because quaternion multiplication is noncommutative, this yields a different neural network. When the activation function is linear, a multilayered neural network can ordinarily be expressed as a single-layered one; however, the TQNN cannot be expressed as a single-layered QNN even when the activation function is linear. The TQNN is therefore expected to produce a variety of signal-processing systems. We performed computer simulations comparing the QNN and the TQNN and found that the TQNN learns slightly faster. Moreover, the simulations showed that the QNN tended to become trapped in local minima or on plateaus, whereas the TQNN did not. Since reducibility is considered a cause of local minima and plateaus, we also discuss the differences in reducibility between the QNN and the TQNN. (c) 2012 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
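The abstract's central observation, that quaternion multiplication is noncommutative, so reversing the order of a product yields a genuinely different map, can be illustrated with a minimal sketch. This is not the paper's code; quaternions are represented as plain tuples and the helper name `qmul` is our own:

```python
# Minimal illustration of quaternion noncommutativity (not from the paper).
# A quaternion is represented as a tuple (w, x, y, z) = w + x*i + y*j + z*k.

def qmul(p, q):
    """Hamilton product of two quaternions p and q."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

i = (0.0, 1.0, 0.0, 0.0)
j = (0.0, 0.0, 1.0, 0.0)

print(qmul(i, j))  # (0.0, 0.0, 0.0, 1.0)  ->  i*j =  k
print(qmul(j, i))  # (0.0, 0.0, 0.0, -1.0) ->  j*i = -k
```

Because a product and its reversal generally differ, a layer that multiplies its quaternion weights on the other side of the input (the modification the TQNN makes to the QNN) computes a different function, which is why the twisted network is not reducible to a single-layered QNN even with linear activations.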

