Journal
FRONTIERS IN COMPUTATIONAL NEUROSCIENCE
Volume 15, Issue -, Pages -
Publisher
FRONTIERS MEDIA SA
DOI: 10.3389/fncom.2021.678158
Keywords
recurrent neural network; dynamical systems; continuous time; bifurcations; time-series
Funding
- NIH [EB-026946]
- NSF [IIS-1845836]
- Institute of Advanced Computational Science Jr. Researcher Fellowship at Stony Brook University
The study reveals the complexity of internal dynamics in gated recurrent unit (GRU) networks, including nonlinear oscillations, multi-stable dynamics, and saddle-node bifurcations; however, the authors were unable to train the networks to produce continuous attractors. These findings provide insight into the working principles and performance of GRU networks.
Gated recurrent units (GRUs) are specialized memory elements for building recurrent neural networks. Despite their incredible success on various tasks, including extracting dynamics underlying neural data, little is understood about the specific dynamics representable in a GRU network. As a result, it is difficult to know a priori how well a GRU network will perform on a given task, or to assess its capacity to mimic the underlying behavior of its biological counterparts. Using a continuous-time analysis, we gain intuition into the inner workings of GRU networks. We restrict our presentation to low dimensions, allowing for comprehensive visualization. We found a surprisingly rich repertoire of dynamical features that includes stable limit cycles (nonlinear oscillations), multi-stable dynamics with various topologies, and homoclinic bifurcations. At the same time, we were unable to train GRU networks to produce continuous attractors, which are hypothesized to exist in biological neural networks. We contextualize the usefulness of the different kinds of observed dynamics and support our claims experimentally.
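To illustrate the kind of continuous-time analysis the abstract describes, the sketch below treats a small GRU as a dynamical system by taking the usual ODE limit of the discrete update h(t+1) = (1-z)⊙h + z⊙h̃, giving dh/dt = z⊙(h̃ - h), and integrates it with forward Euler. This is a minimal illustration under assumed random weights, not the paper's actual networks or training setup.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative random recurrent weights for a 2-unit, input-free GRU
# (assumed for demonstration; not taken from the paper).
rng = np.random.default_rng(0)
Uz, Ur, Uh = (rng.normal(size=(2, 2)) for _ in range(3))
bz, br, bh = (rng.normal(size=2) for _ in range(3))

def gru_vector_field(h):
    """Continuous-time GRU dynamics dh/dt = z * (h_tilde - h)."""
    z = sigmoid(Uz @ h + bz)              # update gate
    r = sigmoid(Ur @ h + br)              # reset gate
    h_tilde = np.tanh(Uh @ (r * h) + bh)  # candidate state
    return z * (h_tilde - h)

# Forward-Euler integration of a single trajectory from a small
# initial state; in 2D such trajectories can be plotted directly
# against the flow field to reveal fixed points or limit cycles.
h = np.array([0.1, -0.2])
dt = 0.05
trajectory = [h.copy()]
for _ in range(2000):
    h = h + dt * gru_vector_field(h)
    trajectory.append(h.copy())
trajectory = np.asarray(trajectory)
```

Because each Euler step is a convex combination of h and the bounded candidate state h̃ (tanh output), the hidden state remains confined to the unit box, which is what makes exhaustive 2D phase-portrait visualization feasible.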