Article

Echoes in correlated neural systems

Journal

NEW JOURNAL OF PHYSICS
Volume 15, Issue 2, Article 023002

Publisher

IOP PUBLISHING LTD
DOI: 10.1088/1367-2630/15/2/023002

Keywords

-

Funding

  1. Helmholtz Association: HASB
  2. Helmholtz Association: portfolio theme SMHB
  3. Next-Generation Supercomputer Project of MEXT
  4. EU grant (FACETS) [15879]
  5. EU grant (BrainScaleS) [269921]

Abstract

Correlations are employed in modern physics to explain microscopic and macroscopic phenomena, like the fractional quantum Hall effect and the Mott insulator state in high-temperature superconductors and ultracold atoms. Simultaneously probed neurons in the intact brain reveal correlations between their activity, an important measure for studying information processing in the brain, which also influences macroscopic signals of neural activity, like the electroencephalogram (EEG). Networks of spiking neurons differ from most physical systems: the interaction between elements is directed, time delayed, mediated by short pulses, and each neuron receives events from thousands of neurons. Even the stationary state of the network cannot be described by equilibrium statistical mechanics. Here we develop a quantitative theory of pairwise correlations in finite-sized random networks of spiking neurons. We derive explicit analytic expressions for the population-averaged cross-correlation functions. Our theory explains why the intuitive mean-field description fails, how the echo of single action potentials causes an apparent lag of inhibition with respect to excitation, and how the size of the network can be scaled while maintaining its dynamical state. Finally, we derive a new criterion for the emergence of collective oscillations from the spectrum of the time-evolution propagator.
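The structure of such results can be sketched in the standard linear-response (Hawkes-type) form for the matrix of cross-spectra; the notation below is illustrative and not necessarily that of the paper:

    C(\omega) = [\mathbb{1} - M(\omega)]^{-1} \, D \, [\mathbb{1} - M(\omega)]^{-\dagger}

Here M(\omega) is the effective connectivity in the frequency domain (single-neuron transfer function times synaptic weights, including delays), D is the diagonal matrix of single-neuron output spectra, and \dagger denotes the conjugate transpose. In this picture the "echo" is the part of a neuron's input that consists of its own earlier spikes returned through the network, and collective oscillations set in when an eigenvalue of M(\omega) approaches unity, i.e. when the resolvent [\mathbb{1} - M(\omega)]^{-1} develops a pole.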

Authors

Moritz Helias, Tom Tetzlaff, Markus Diesmann


Recommended

Article Physics, Multidisciplinary

Transient Chaotic Dimensionality Expansion by Recurrent Networks

Christian Keup, Tobias Kuehn, David Dahmen, Moritz Helias

Summary: This study explores the differences in chaotic phenomena between rate and binary networks, finding qualitative distinctions in their dynamics, chaotic characteristics, and dimensionality expansion.

PHYSICAL REVIEW X (2021)

Article Physics, Multidisciplinary

Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions

Alexander van Meegen, Tobias Kuehn, Moritz Helias

Summary: This study unifies the field-theoretical approach to neuronal networks with large-deviation theory, deriving through field theory a rate function that resembles a Kullback-Leibler divergence; this enables data-driven inference of model parameters and the calculation of fluctuations beyond mean-field theory. The study also reveals a regime with fluctuation-induced transitions between mean-field solutions. (A schematic reminder of the Kullback-Leibler form of a rate function follows this entry.)

PHYSICAL REVIEW LETTERS (2021)
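For orientation only: in the classical Sanov setting (i.i.d. samples from a distribution p), the probability that the empirical measure looks like q decays as exp(-N I(q)) with the rate function

    I(q) = D_{\mathrm{KL}}(q \,\|\, p) = \int q(x) \ln \frac{q(x)}{p(x)} \, dx

This is the textbook sense in which a rate function can resemble a Kullback-Leibler divergence; the expression derived in the paper concerns correlated network trajectories and need not take exactly this form.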

Article Neurosciences

Dynamical Characteristics of Recurrent Neuronal Networks Are Robust Against Low Synaptic Weight Resolution

Stefan Dasbach, Tom Tetzlaff, Markus Diesmann, Johanna Senk

Summary: This study explores the effects of limited synaptic weight resolution on the dynamics of spiking neuronal networks. A naive discretization of the weights may distort spike-train statistics, whereas preserving the mean and variance of the total synaptic input currents maintains the firing statistics for certain network types; for others, substantial deviations may nonetheless occur, underscoring the need for careful validation and for preserving specific network characteristics. A minimal numerical illustration of the mean- and variance-preserving idea follows this entry.

FRONTIERS IN NEUROSCIENCE (2021)
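A minimal sketch of that mean- and variance-preserving idea, assuming a homogeneous in-degree K and a Gaussian weight distribution; the variable names and the affine correction of the grid are illustrative and not the authors' procedure:

    import numpy as np

    rng = np.random.default_rng(1)
    K = 1000                                  # assumed in-degree per neuron
    w = rng.normal(0.1, 0.02, size=100_000)   # hypothetical continuous weights (mV)

    def discretize(weights, n_levels=4):
        """Round weights onto n_levels equally spaced values (naive discretization)."""
        lo, hi = weights.min(), weights.max()
        step = (hi - lo) / (n_levels - 1)
        return lo + np.round((weights - lo) / step) * step

    w_naive = discretize(w)                   # coarse grid distorts mean and variance

    # Affine correction: rescale and shift the grid so that the mean and variance of
    # the summed input (K * mean, K * var) match the continuous weights again. The
    # corrected weights still take only n_levels distinct values, on a rescaled grid.
    w_fixed = w.mean() + (w_naive - w_naive.mean()) * (w.std() / w_naive.std())

    for name, ww in [("continuous", w), ("naive", w_naive), ("corrected", w_fixed)]:
        print(f"{name:10s} input mean = {K * ww.mean():6.2f}   input var = {K * ww.var():6.3f}")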

Article Neurosciences

Simulating the Cortical Microcircuit Significantly Faster Than Real Time on the IBM INC-3000 Neural Supercomputer

Arne Heittmann, Georgia Psychou, Guido Trensch, Charles E. Cox, Winfried W. Wilcke, Markus Diesmann, Tobias G. Noll

Summary: This article utilizes the new IBM INC-3000 prototype FPGA-based neural supercomputer to implement a popular model of the cortical microcircuit, achieving a high speed-up factor and demonstrating the potential of FPGA systems for neural modeling.

FRONTIERS IN NEUROSCIENCE (2022)

Article Neurosciences

NEST Desktop, an Educational Application for Neuroscience

Sebastian Spreizer, Johanna Senk, Stefan Rotter, Markus Diesmann, Benjamin Weyers

Summary: The article introduces a web-based Graphical User Interface (GUI) for simulation software of spiking neuronal network models, reducing the entry barrier for students and early career scientists in computational neuroscience. The NEST Desktop tool, with graphical elements for creating, configuring, running, and analyzing network models, enhances the quality and intensity of teaching in computational neuroscience. The availability of this tool on public resources contributes to equal opportunities in the field.

ENEURO (2021)

Article Physics, Multidisciplinary

Gell-Mann-Low Criticality in Neural Networks

Lorenzo Tiberi, Jonas Stapmanns, Tobias Kuehn, Thomas Luu, David Dahmen, Moritz Helias

Summary: Criticality is closely linked to optimal computational capacity, but the lack of a renormalized theory of critical brain dynamics hinders insights into biological information processing. A renormalized theory of a prototypical neural field theory is presented, focusing on the flow of couplings across different length scales to achieve an effective trade-off between linearity and nonlinearity for information storage and computation.

PHYSICAL REVIEW LETTERS (2022)

Article Biochemical Research Methods

Sequence learning, prediction, and replay in networks of spiking neurons

Younes Bouhadjar, Dirk J. Wouters, Markus Diesmann, Tom J. Tetzlaff

Summary: This study proposes a biologically-based algorithm for sequence learning, prediction, and replay. The algorithm can continuously learn complex sequences in an unsupervised manner. The study also sheds light on the mechanisms of sequence processing speed and provides an explanation for the observed fast sequence replay in the hippocampus and neocortex.

PLOS COMPUTATIONAL BIOLOGY (2022)

Article Computer Science, Theory & Methods

Routing brain traffic through the von Neumann bottleneck: Efficient cache usage in spiking neural network simulation code on general purpose computers

J. Pronold, J. Jordan, B. J. N. Wylie, I Kitayama, M. Diesmann, S. Kunkel

Summary: Simulation is an important approach for studying complex dynamical systems, in particular biological neural networks. This study shows how careful cache usage mitigates the von Neumann bottleneck and improves the efficiency of spiking neural network simulation code on general-purpose computers.

PARALLEL COMPUTING (2022)

Article Biochemical Research Methods

Connectivity concepts in neuronal network modeling

Johanna Senk, Birgit Kriener, Mikael Djurfeldt, Nicole Voges, Han-Jia Jiang, Lisa Schuettler, Gabriele Gramelsberger, Markus Diesmann, Hans E. Plesser, Sacha J. van Albada

Summary: Sustainable research on computational models of neuronal networks requires understandable, reproducible, and extendable published models. However, missing details or ambiguities in mathematical concepts, algorithmic implementations, or parameterizations hinder progress. This work aims to provide complete and concise descriptions of network connectivity and guide the implementation of connection routines in simulation software and neuromorphic hardware systems. A review of existing models reveals a substantial proportion of ambiguous descriptions. Based on this review, a set of connectivity concepts is derived and a unified graphical notation is proposed to facilitate intuitive understanding of network properties. The proposed standardizations are expected to contribute to unambiguous descriptions and reproducible implementations of neuronal network connectivity in computational neuroscience.

PLOS COMPUTATIONAL BIOLOGY (2022)

Article Biochemical Research Methods

Coherent noise enables probabilistic sequence replay in spiking neuronal networks

Younes S. Bouhadjar, Dirk J. Wouters, Markus S. Diesmann, Tom Tetzlaff

Summary: Humans and other animals benefit from exploring multiple alternative solutions to problems, rather than relying on a single optimal solution. This explorative behavior is often attributed to noise in neuronal dynamics. However, equipping a neuronal circuit with noise does not necessarily lead to explorative dynamics unless the noise is correlated within ensembles. This modeling study introduces configurable, locally coherent noise to create explorative behavior in a neuronal sequence-processing circuit, contributing to understanding the neuronal mechanisms underlying decision strategies and sequential memory recall.

PLOS COMPUTATIONAL BIOLOGY (2023)

Article Physics, Multidisciplinary

Statistical temporal pattern extraction by neuronal architecture

Sandra Nestler, Moritz Helias, Matthieu Gilson

Summary: The study demonstrates a simple biologically inspired feedforward neuronal model that utilizes higher-order temporal fluctuations for information extraction in time-series classification. By training synaptic weights and a nonlinear gain function, the study shows how the nonlinearity allows higher-order correlations to be transferred and classification accuracy to be maximized. The results highlight that the biologically inspired architecture makes more efficient use of its trainable parameters than classical machine learning approaches.

PHYSICAL REVIEW RESEARCH (2023)

Article Physics, Multidisciplinary

Decomposing neural networks as mappings of correlation functions

Kirsten Fischer, Alexandre Rene, Christian Keup, Moritz Layer, David Dahmen, Moritz Helias

Summary: Understanding the functional principles of information processing in deep neural networks with trained weights is a challenge. This study characterizes the mapping implemented by a deep feed-forward network as an iterated transformation of probability distributions. The analysis reveals that correlations up to second order play a key role in capturing the information processing performed by the internal layers of the network. (A small numerical illustration of this idea follows this entry.)

PHYSICAL REVIEW RESEARCH (2022)
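A small numerical probe of the general idea, assuming a random (untrained) fully connected network with tanh units; this is not the authors' analysis, merely a check of how much of a layer-to-layer mapping is captured by the mean and covariance alone:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_samples = 50, 50, 20_000

    # Random feed-forward weights (assumed, untrained network).
    W1 = rng.normal(0, 1.0 / np.sqrt(n_in), size=(n_hidden, n_in))
    W2 = rng.normal(0, 1.0 / np.sqrt(n_hidden), size=(n_hidden, n_hidden))

    x = rng.normal(size=(n_samples, n_in))   # Gaussian input ensemble
    h1 = np.tanh(x @ W1.T)                   # first-layer activations
    h2_true = np.tanh(h1 @ W2.T)             # second-layer activations

    # Gaussian surrogate of layer 1: keep only the mean and covariance of h1.
    mu, cov = h1.mean(axis=0), np.cov(h1, rowvar=False)
    h1_surr = rng.multivariate_normal(mu, cov, size=n_samples)
    h2_surr = np.tanh(h1_surr @ W2.T)

    # If second-order statistics dominate, the surrogate ensemble should reproduce
    # the second-layer statistics of the true ensemble.
    def stats(a):
        return a.mean(axis=0), np.cov(a, rowvar=False)

    (m_t, c_t), (m_s, c_s) = stats(h2_true), stats(h2_surr)
    print("max |mean difference|      :", np.abs(m_t - m_s).max())
    print("max |covariance difference|:", np.abs(c_t - c_s).max())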

Article Engineering, Electrical & Electronic

Sub-realtime simulation of a neuronal network of natural density

Anno C. Kurth, Johanna Senk, Dennis Terhorst, Justin Finnerty, Markus Diesmann

Summary: Simulating full-scale neuronal network models of the brain is challenging, but this study achieved run times shorter than the simulated biological time span using a recent conventional compute node. Real-time performance is important for robotics and closed-loop applications, while sub-realtime performance is desirable for studying learning and development in the brain, which unfold over hours and days.

NEUROMORPHIC COMPUTING AND ENGINEERING (2022)

Article Physics, Multidisciplinary

Global hierarchy vs local structure: Spurious self-feedback in scale-free networks

Claudia Merger, Timo Reinartz, Stefan Wessel, Carsten Honerkamp, Andreas Schuppert, Moritz Helias

Summary: Networks with fat-tailed degree distributions often contain hubs, nodes with very many connections, which are crucial for the transition into a globally ordered network state. Higher-order interaction effects counteract the spurious self-feedback on hubs, highlighting their importance for the distinct onset of local versus global order in the network. This mechanism may be relevant for other systems with a strongly hierarchical underlying network structure.

PHYSICAL REVIEW RESEARCH (2021)
