This title appears in the Scientific Report 2017.
Transition to chaos and signal response in driven random neural networks
Personal Name(s): Goedeke, Sven / Schücker, Jannis (corresponding author) / Helias, Moritz
Contributing Institute: Computational and Systems Neuroscience (IAS-6 / INM-6)
Imprint: 2017
Conference: Dynamical Network States, Criticality and Cortical Function, Delmenhorst (Germany), 2017-03-25 – 2017-03-28
Document Type: Poster
Research Program: Supercomputing and Modelling for the Human Brain; Human Brain Project Specific Grant Agreement 1; Theory of multi-scale neuronal networks; Theory, modelling and simulation
Recurrent networks of randomly coupled rate neurons display a transition to chaos at a critical coupling strength [1]. The rich internal dynamics emerging near the transition have been associated with optimal information-processing capabilities [2]. In particular, the dynamics become arbitrarily slow at the onset of chaos, akin to critical slowing down. However, the interplay between a time-dependent signal, the dynamics of the network, and the resulting consequences for information processing is poorly understood.

Here we investigate the effect of time-varying inputs on the phase diagram of the network. In particular, using dynamic mean-field theory we study the largest Lyapunov exponent, which quantifies the rate of exponential divergence or convergence of nearby trajectories. We analytically determine the transition to chaos as a function of coupling strength and input amplitude. The transition is shifted to significantly larger coupling strengths than predicted by linear stability analysis of the local Jacobian matrix. This displacement gives rise to a novel dynamical regime that combines locally expansive dynamics with asymptotic stability. Moreover, we show that the slow internal dynamics are strongly suppressed by the external time-varying drive.

To study signal-processing capabilities, we evaluate the capacity to reconstruct a past input from a linear readout applied to the present state, the so-called memory curve [3]. We find that, for a given signal amplitude, the memory capacity peaks within the novel dynamical regime. This result indicates that locally expanding yet asymptotically stable dynamics are beneficial for storing information about the input in the dynamics of the neural network.

[1] H. Sompolinsky, A. Crisanti, and H. J. Sommers, Phys. Rev. Lett. 61, 259 (1988).
[2] N. Bertschinger and T. Natschläger, Neural Comput. 16, 1413 (2004).
[3] H. Jaeger, Short term memory in echo state networks, vol. 5 (GMD-Forschungszentrum Informationstechnik, 2001).
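The largest Lyapunov exponent discussed in the abstract can be estimated numerically by co-evolving a reference and a perturbed trajectory and renormalizing their separation (a Benettin-style procedure). The following is a minimal sketch for the autonomous (undriven) random rate network of [1] with a tanh nonlinearity; the network size `N`, gain `g`, and step size `dt` are illustrative choices, not values from the poster, and the poster's analytical mean-field treatment of the driven case is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g, dt, steps = 200, 1.5, 0.05, 4000        # illustrative parameters
J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # couplings, variance g^2/N

def step(x):
    # Euler step of the rate dynamics dx/dt = -x + J tanh(x)
    return x + dt * (-x + J @ np.tanh(x))

# Benettin-style estimate: evolve a reference trajectory x and a
# perturbed copy y, rescale their separation back to d0 after every
# step, and average the accumulated log growth factors.
x = rng.normal(size=N)
for _ in range(1000):                          # discard an initial transient
    x = step(x)

d0 = 1e-8
pert = rng.normal(size=N)
y = x + d0 * pert / np.linalg.norm(pert)       # separation of exactly d0

log_growth = 0.0
for _ in range(steps):
    x, y = step(x), step(y)
    d = np.linalg.norm(y - x)
    log_growth += np.log(d / d0)
    y = x + (d0 / d) * (y - x)                 # rescale separation to d0

lyap = log_growth / (steps * dt)
print(f"largest Lyapunov exponent ~ {lyap:.3f}")
```

For `g > 1` the autonomous network is chaotic and the estimate comes out positive; adding a time-varying input term inside `step` would let one probe the shift of the transition that the poster computes analytically.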