Article

Gated Recurrent Units Viewed Through the Lens of Continuous Time Dynamical Systems

Journal

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fncom.2021.678158

Keywords

recurrent neural network; dynamical systems; continuous time; bifurcations; time-series

Funding

  1. NIH [EB-026946]
  2. NSF [IIS-1845836]
  3. Institute for Advanced Computational Science Junior Researcher Fellowship, Stony Brook University

Abstract

The study reveals the complexity of the internal dynamics of gated recurrent unit networks, including nonlinear oscillations, multi-stable dynamics, and saddle-node bifurcations, while showing that the networks could not be trained to produce continuous attractors. These findings offer insight into the working principles and performance of GRU networks.
Gated recurrent units (GRUs) are specialized memory elements for building recurrent neural networks. Despite their considerable success on a variety of tasks, including extracting the dynamics underlying neural data, little is understood about the specific dynamics representable in a GRU network. As a result, it is difficult to know a priori both how well a GRU network will perform on a given task and what its capacity is to mimic the underlying behavior of its biological counterparts. Using a continuous-time analysis, we gain intuition on the inner workings of GRU networks. We restrict our presentation to low dimensions, allowing for comprehensive visualization. We found a surprisingly rich repertoire of dynamical features, including stable limit cycles (nonlinear oscillations), multi-stable dynamics with various topologies, and homoclinic bifurcations. At the same time, we were unable to train GRU networks to produce continuous attractors, which are hypothesized to exist in biological neural networks. We contextualize the usefulness of the different kinds of observed dynamics and support our claims experimentally.
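The continuous-time view mentioned above is commonly obtained by reading the standard discrete GRU update h ← (1−z)⊙h + z⊙c as a forward-Euler step of the ODE dh/dt = z⊙(c−h), with the update gate z setting a state-dependent timescale. A minimal NumPy sketch of this reading follows; the weights are random illustrative values, not parameters from the paper, and the paper's exact continuous-time formulation may differ in detail:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_ode_step(h, x, params, dt=0.1):
    """One forward-Euler step of the continuous-time GRU reading
    dh/dt = z * (c - h); at dt = 1 this recovers the usual discrete update
    h <- (1 - z) * h + z * c."""
    Wz, Uz, bz, Wr, Ur, br, Wc, Uc, bc = params
    z = sigmoid(Wz @ x + Uz @ h + bz)        # update gate: sets the timescale
    r = sigmoid(Wr @ x + Ur @ h + br)        # reset gate: gates recurrent input
    c = np.tanh(Wc @ x + Uc @ (r * h) + bc)  # candidate state, bounded in (-1, 1)
    return h + dt * z * (c - h)              # Euler integration step

# Illustrative 2D autonomous system (zero input), as in the paper's
# low-dimensional phase-plane analyses; weights are arbitrary here.
rng = np.random.default_rng(0)
n = 2
params = tuple(
    rng.standard_normal((n, n)) if i % 3 != 2 else rng.standard_normal(n)
    for i in range(9)
)
h = np.zeros(n)
x = np.zeros(n)
for _ in range(200):
    h = gru_ode_step(h, x, params)
```

Because c is bounded by tanh and each step is a convex-like blend of h and c (for dt·z < 1), trajectories started inside the unit box remain bounded, which is what makes the phase-plane visualization of fixed points and limit cycles tractable.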

