This title appears in the Scientific Report 2021.
Fluctuations, correlations, chaos: dynamics and computation in recurrent networks
Personal Name(s): Helias, Moritz (Corresponding author); van Meegen, Alexander; Dahmen, David; Keup, Christian; Nestler, Sandra
Contributing Institute: Computational and Systems Neuroscience (INM-6); Computational and Systems Neuroscience (IAS-6); Jara-Institut Brain structure-function relationships (INM-10)
Imprint: 2021
Conference: MILA Seminar, online (Canada)
Document Type: Talk (non-conference)
Research Program: Advanced Computing Architectures; Transparent Deep Learning with Renormalized Flows; Recurrence and stochasticity for neuro-inspired computation; Human Brain Project Specific Grant Agreement 3; Human Brain Project Specific Grant Agreement 2; Towards an integrated data science of complex natural systems; Theory of multi-scale neuronal networks; Emerging NC Architectures; Computational Principles; Neuroscientific Foundations
The remarkable properties of information processing by biological and artificial neuronal networks arise from the interaction of large numbers of neurons. A central quest is thus to characterize their collective states. The directed coupling between pairs of neurons and their continuous dissipation of energy, moreover, place the dynamics of neuronal networks outside thermodynamic equilibrium. Tools from non-equilibrium statistical mechanics and field theory are thus useful to obtain a quantitative understanding. We here present recent progress using such approaches [1].

We show how activity in large, random networks can be described by a unified approach of path integrals and large deviation theory that allows the inference of parameters from data and the prediction of future activity [2]. This approach also allows one to quantify fluctuations around the mean-field theory. These are important to understand why correlations observed between pairs of neurons indicate that the dynamics of cortical networks are poised near a critical point [3]. Close to this transition, we find chaotic dynamics and prolonged sequential memory for past signals [4]. In the chaotic regime, networks offer representations of information whose dimensionality expands with time. We show how this mechanism aids classification performance [5]. Performance in such settings of reservoir computing, moreover, depends sensitively on the way information is fed into the network. Formally unrolling recurrence with the help of Green's functions yields a controlled, practical method to optimize reservoir computing [6]. Together these works illustrate the fruitful interplay between theoretical physics, neuronal networks, and neural information processing. (Two minimal sketches of the underlying model class follow the references below.)

References:
1. Helias, Dahmen (2020). Statistical Field Theory for Neural Networks. Springer Lecture Notes in Physics.
2. van Meegen, Kuehn, Helias (2020). Large Deviation Approach to Random Recurrent Neuronal Networks: Rate Function, Parameter Inference, and Activity Prediction. arXiv:2009.08889.
3. Dahmen, Grün, Diesmann, Helias (2019). Second type of criticality in the brain uncovers rich multiple-neuron dynamics. PNAS 116(26), 13051-13060.
4. Schuecker, Goedeke, Helias (2018). Optimal sequence memory in driven random networks. Phys. Rev. X 8, 041029.
5. Keup, Kuehn, Dahmen, Helias (2020). Transient chaotic dimensionality expansion by recurrent networks. arXiv:2002.11006.
6. Nestler, Keup, Dahmen, Gilson, Rauhut, Helias (2020). Unfolding recurrence by Green's functions for optimized reservoir computing. Advances in Neural Information Processing Systems 33 (NeurIPS 2020).
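To make the model class concrete, here is a minimal sketch (not the authors' code; all parameter values are chosen purely for illustration) of the standard random rate network dx/dt = -x + J tanh(x) with Gaussian couplings J_ij ~ N(0, g^2/N). Below the transition at g = 1 a small perturbation of the initial condition decays; above it, the chaotic dynamics amplify it:

```python
import numpy as np

rng = np.random.default_rng(0)
N, dt, steps = 500, 0.05, 1500   # neurons, Euler step, integration steps

def trajectory(J, x0):
    """Euler integration of the rate dynamics dx/dt = -x + J @ tanh(x)."""
    x, xs = x0.copy(), []
    for _ in range(steps):
        x = x + dt * (-x + J @ np.tanh(x))
        xs.append(x)
    return np.array(xs)

for g in (0.5, 1.5):             # below vs. above the transition at g = 1
    J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # J_ij ~ N(0, g^2/N)
    x0 = rng.standard_normal(N)
    xa = trajectory(J, x0)
    xb = trajectory(J, x0 + 1e-6 * rng.standard_normal(N))  # tiny perturbation
    # distance between the two trajectories: decays for g < 1, grows for g > 1
    dist = np.linalg.norm(xa - xb, axis=1)
    print(f"g = {g}: |delta| after 1 step {dist[0]:.1e}, at end {dist[-1]:.1e}")
```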
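And a minimal reservoir-computing sketch in the same spirit: an echo-state-style setup with a plain ridge-regression readout, not the Green's-function method of Ref. [6]. The network size, input weights, delay, and regularization are all illustrative assumptions; the task, recalling the input some steps in the past, probes the sequential memory discussed in Ref. [4]:

```python
import numpy as np

rng = np.random.default_rng(1)
N, dt, T, delay, ridge = 300, 0.05, 4000, 20, 1e-4  # illustrative values

g = 0.9                                   # couplings just below the transition
J = g * rng.standard_normal((N, N)) / np.sqrt(N)
w_in = rng.standard_normal(N)             # random input projection
u = rng.standard_normal(T)                # white-noise input signal

# drive the network and collect the reservoir states
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = x + dt * (-x + J @ np.tanh(x) + w_in * u[t])
    X[t] = x

# train a ridge-regression readout on the first half to output u[t - delay],
# then evaluate it on the second half
y = np.roll(u, delay)
tr, te = slice(delay, T // 2), slice(T // 2, T)
w = np.linalg.solve(X[tr].T @ X[tr] + ridge * np.eye(N), X[tr].T @ y[tr])
print("test correlation:", np.corrcoef(X[te] @ w, y[te])[0, 1])
```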