This title appears in the Scientific Report 2020.
Please use the identifier http://hdl.handle.net/2128/25881 in citations.
Transient chaotic dimensionality expansion by recurrent networks
Personal Name(s): Keup, Christian (corresponding author); Kühn, Tobias; Dahmen, David; Helias, Moritz (last author)
Contributing Institute: Computational and Systems Neuroscience, INM-6; Jara-Institut Brain Structure-Function Relationships, INM-10; Computational and Systems Neuroscience, IAS-6
Published in: arXiv:2002.11006
Imprint: 2020
Document Type: Preprint
Research Program: Doctoral researcher without special funding; Recurrence and stochasticity for neuro-inspired computation; Theory of multi-scale neuronal networks; Theory, modelling and simulation
Link: OpenAccess
Source: Publikationsportal JuSER
Abstract: Neurons communicate with spikes, which are discrete events in time. Functional network models, by contrast, often employ rate units coupled continuously by analog signals. Is there a benefit to discrete signaling? Using a unified mean-field theory, we show that large random networks of rate and binary units have identical second-order statistics. Yet their stimulus-processing properties are radically different: we discover a chaotic sub-manifold in binary networks that does not exist in rate models. Its dimensionality increases with time after stimulus onset and reaches a fixed point that depends on the synaptic coupling strength. Low-dimensional stimuli are transiently expanded into higher-dimensional representations within this manifold. High noise resilience persists not only near the edge of chaos but throughout the chaotic regime. In rate models of spiking activity, the effective spiking noise suppresses chaos, severely impairing classification performance; chaotic rate networks without effective spiking noise also show the transient performance boost. The transitions to chaos in the two models do not coincide and have qualitatively different causes. Our theory explains these observations mechanistically. These findings have several implications: 1) Discrete-state networks reach optimal performance with weaker synapses, implying lower energetic costs for synaptic transmission. 2) The classification mechanism is robust to noise and thus compatible with fluctuations in biophysical systems. 3) Optimal performance is reached after only a single activation per participating neuron, demonstrating event-based computation with short latencies. 4) The chaotic sub-manifold predicts a transient increase of variability after stimulus onset. Our results thus provide a hitherto unknown link between recurrent and chaotic dynamics of functional networks, neuronal variability, and the dimensionality of neuronal responses.
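The kind of measurement the abstract describes can be illustrated numerically. The following is a minimal sketch, not the authors' code: it simulates a generic random binary network with asynchronous threshold updates and estimates the effective dimensionality of its activity via the participation ratio of the covariance spectrum. The network size, coupling gain, recording interval, and the small update noise are all arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g = 200, 2.0                                # network size and coupling gain (illustrative choices)
J = rng.normal(0.0, g / np.sqrt(N), (N, N))    # random Gaussian coupling matrix
np.fill_diagonal(J, 0.0)                       # no self-coupling

def simulate_binary(steps=20000, record_every=50):
    """Asynchronous threshold updates of +/-1 units (a generic binary-network model)."""
    s = rng.choice([-1.0, 1.0], size=N)
    trajectory = []
    for t in range(steps):
        i = rng.integers(N)                    # update one randomly chosen unit
        h = J[i] @ s + 0.1 * rng.normal()      # local field plus small noise (arbitrary amplitude)
        s[i] = 1.0 if h >= 0.0 else -1.0
        if t % record_every == 0:
            trajectory.append(s.copy())
    return np.array(trajectory)

def participation_ratio(X):
    """Effective dimensionality (tr C)^2 / tr(C^2) of the state covariance."""
    ev = np.linalg.eigvalsh(np.cov(X.T))
    return ev.sum() ** 2 / (ev ** 2).sum()

activity = simulate_binary()
print(f"participation ratio: {participation_ratio(activity):.1f} of {N} units")
```

Recording the participation ratio over windows at increasing delays after a stimulus, rather than over the whole run as here, would correspond to the transient dimensionality expansion the paper reports.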