This title appears in the Scientific Report 2017.
Optimal sequence memory in driven random networks
Personal Name(s): Schücker, Jannis (Corresponding author); Goedeke, Sven; Helias, Moritz
Contributing Institute: Computational and Systems Neuroscience (INM-6); Jara-Institut Brain structure-function relationships (INM-10); Computational and Systems Neuroscience (IAS-6)
Imprint: 2017
Document Type: Preprint
Research Program: Human Brain Project Specific Grant Agreement 1; Supercomputing and Modelling for the Human Brain; Theory of multi-scale neuronal networks; Theory, modelling and simulation
Publications portal JuSER
Abstract: Autonomous randomly coupled neural networks display a transition to chaos at a critical coupling strength. We here investigate the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing. Dynamic mean-field theory yields the statistics of the activity, the maximum Lyapunov exponent, and the memory capacity of the network. We find an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form. The input suppresses chaos by a dynamic mechanism, shifting the transition to significantly larger coupling strengths than predicted by local stability analysis. Beyond linear stability, a regime of coexistent locally expansive, but non-chaotic dynamics emerges that optimizes the capacity of the network to store sequential input.
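The transition described in the abstract can be illustrated numerically. The sketch below simulates the standard randomly coupled rate network, dx/dt = -x + J tanh(x) + I(t) with Gaussian couplings of variance g²/N, and estimates the largest Lyapunov exponent with a Benettin-style two-trajectory method. All specifics here (network size N, tanh nonlinearity, sinusoidal drive, integration scheme) are illustrative assumptions, not the paper's method — the paper treats the driven network analytically via dynamic mean-field theory.

```python
import numpy as np

def largest_lyapunov(g, input_amp=0.0, N=200, T=100.0, dt=0.05, seed=0):
    """Estimate the largest Lyapunov exponent of a random rate network
    dx/dt = -x + J tanh(x) + I(t), with J_ij ~ N(0, g^2/N) and a common
    sinusoidal drive I(t) (illustrative sketch, not the paper's method)."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, size=N)
    t = 0.0

    def drive(t):
        # hypothetical low-frequency sinusoidal input, identical for all units
        return input_amp * np.sin(0.2 * np.pi * t)

    # discard a transient so the trajectory settles onto its attractor
    for _ in range(int(50.0 / dt)):
        x += dt * (-x + J @ np.tanh(x) + drive(t))
        t += dt

    d0 = 1e-8                       # size of the test perturbation
    u = rng.normal(size=N)
    y = x + d0 * u / np.linalg.norm(u)
    log_growth = 0.0
    steps = int(T / dt)
    for _ in range(steps):
        inp = drive(t)
        x += dt * (-x + J @ np.tanh(x) + inp)
        y += dt * (-y + J @ np.tanh(y) + inp)
        t += dt
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (d0 / d) * (y - x)  # renormalize the separation
    return log_growth / (steps * dt)
```

For the autonomous network (`input_amp=0`), coupling strengths below the classical critical value g = 1 give a negative exponent (stable dynamics), while g well above 1 gives a positive one (chaos). The paper's central result — that a time-varying input shifts this transition to significantly larger g — can be explored by increasing `input_amp` at fixed supercritical coupling.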