Journal
IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING
Volume 7, Issue 4, Pages 397-401
Publisher
WILEY
DOI: 10.1002/tee.21746
Keywords
neural networks; quaternion; learning; backpropagation algorithm
The quaternary neural network (QNN) proposed by Nitta is a high-dimensional neural network. Nitta showed by computer simulations that its learning is faster than that of ordinary neural networks and that it requires almost one-third as many parameters as real-valued neural networks. In this paper, we propose the twisted quaternary neural network (TQNN), which changes the directions of the multiplications in the QNN. Since quaternion multiplication is noncommutative, this yields a different neural network. When the activation function is linear, a multilayered neural network can be expressed as a single-layered one; the TQNN, however, cannot be expressed by a single-layered QNN even when the activation function is linear. Therefore, the TQNN is expected to produce a variety of signal-processing systems. We performed computer simulations to compare the QNN and the TQNN and found that the latter learns slightly faster. Moreover, the simulations showed that the QNN tended to be trapped in local minima or plateaus, whereas the TQNN did not. Reducibility is known to cause local minima and plateaus, so we also discuss the differences in reducibility between the QNN and the TQNN. (c) 2012 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
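The abstract's key observation is that quaternion multiplication is noncommutative, so reversing the order of weight-input products ("twisting") yields a genuinely different network. A minimal sketch of that fact (not the paper's implementation; `w` and `x` are hypothetical weight and input quaternions chosen for illustration):

```python
# Hamilton product of two quaternions represented as (real, i, j, k) tuples.
def qmul(p, q):
    a, b, c, d = p
    e, f, g, h = q
    return (a*e - b*f - c*g - d*h,
            a*f + b*e + c*h - d*g,
            a*g - b*h + c*e + d*f,
            a*h + b*g - c*f + d*e)

w = (0.0, 1.0, 0.0, 0.0)   # the unit quaternion i (hypothetical weight)
x = (0.0, 0.0, 1.0, 0.0)   # the unit quaternion j (hypothetical input)

# Swapping the order of multiplication flips the sign of the result:
print(qmul(w, x))   # i * j =  k  -> (0.0, 0.0, 0.0, 1.0)
print(qmul(x, w))   # j * i = -k  -> (0.0, 0.0, 0.0, -1.0)
```

Because `w * x != x * w` in general, a layer that multiplies weights on the opposite side computes a different linear map, which is why the TQNN cannot be collapsed into a single-layered QNN even with linear activations.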