This title appears in the Scientific Report 2019.
Please use the identifier http://hdl.handle.net/2128/22679 in citations.
Optimized Reservoir Computing with Stochastic Recurrent Networks
Personal Name(s): Nestler, Sandra (corresponding author); Keup, Christian; Dahmen, David; Helias, Moritz
Contributing Institute: JARA-Institut Brain structure-function relationships; INM-10 Computational and Systems Neuroscience; IAS-6 Computational and Systems Neuroscience; INM-6
Imprint: 2019
Conference: CNS 2019 Barcelona, Barcelona (Spain), 2019-07-13 to 2019-07-17
Document Type: Poster
Research Program: Doctoral researcher without special funding; Human Brain Project Specific Grant Agreement 2; Theory of multi-scale neuronal networks; Connectivity and Activity; Theory, modelling and simulation
Link: OpenAccess (Publication portal JuSER)
Cortical networks are strongly recurrent, and neurons have intrinsic temporal dynamics. This sets them apart from deep networks. Reservoir computing [1,2] is an approach that takes these features into account: inputs are mapped into a high-dimensional space spanned by a large number of typically randomly connected neurons, so that the network acts like the kernel of a support vector machine. Functional tasks on the time-dependent inputs are realized by training a linear readout of the network activity (a minimal code sketch follows the reference list).

It has been studied extensively how the performance of the reservoir depends on the properties of the recurrent connectivity; the edge of chaos has been identified as a global indicator of good computational properties [3,4]. However, the interplay of recurrence, nonlinearities, and stochastic neuronal dynamics may offer optimal settings that are not described by such global parameters alone. Here we systematically analyze the kernel properties of recurrent, time-continuous stochastic networks in a binary time-series classification task. We derive a learning rule that maximizes the classification margin. The interplay between the signal and the neuronal noise determines a single optimal readout direction. Finding this direction does not require a training process; it can be computed directly from the network statistics (see the second sketch below). This technique is reliable and yields a measure of linear separability that we use to optimize the remainder of the network. We show that the classification performance depends crucially on the input projection; random projections lead to significantly suboptimal readouts.

We generalize these results to nonlinear networks. Using field-theoretical methods [5], we derive systematic corrections due to neuronal nonlinearities, which decompose the recurrent network into an effective bilinear time-dependent kernel. The resulting expressions expose how the network dynamics separates a priori linearly non-separable time series, and thus explain how recurrent nonlinear networks acquire capabilities beyond those of a linear perceptron.

Acknowledgements: Partly supported by the HGF Young Investigator's Group VH-NG-1028 and by European Union Horizon 2020 grant 785907 (Human Brain Project SGA2).

References:
1. Maass W, Natschläger T, Markram H. Real-time computing without stable states: A new framework for neural computation based on perturbations. Neural Comput. 2002, 14(11), 2531-2560.
2. Jaeger H, Haas H. Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication. Science. 2004, 304, 78-80.
3. Bertschinger N, Natschläger T, Legenstein R. At the edge of chaos: Real-time computations and self-organized criticality in recurrent neural networks. In: Advances in Neural Information Processing Systems 17 (NIPS 2004).
4. Toyoizumi T, Abbott LF. Beyond the edge of chaos: Amplification and temporal integration by recurrent networks in the chaotic regime. Phys. Rev. E. 2011, 84, 051908.
5. Helias M, Dahmen D. Statistical field theory for neural networks. 2019, arXiv:1901.10416.
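To make the reservoir-computing scheme described in the abstract concrete, here is a minimal sketch in Python. The reservoir size, the tanh nonlinearity, the spectral-radius scaling, and the ridge-regression readout are illustrative assumptions, not the specific model presented in the poster.

```python
import numpy as np

# Minimal reservoir-computing sketch (echo-state-network flavour).
# All sizes and the toy task are illustrative choices.

rng = np.random.default_rng(0)

N, n_in, T = 200, 1, 500                         # reservoir size, input dim, time steps
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))      # random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius below 1
W_in = rng.normal(0, 1.0, (N, n_in))             # random input projection

def run_reservoir(u):
    """Drive the reservoir with input u of shape (T, n_in); return the state trajectory."""
    x = np.zeros(N)
    X = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in @ u_t)          # leak-free update, for brevity
        X[t] = x
    return X

# Train only the linear readout, by ridge regression on a toy target series.
u = rng.normal(size=(T, n_in))
y = np.sin(np.cumsum(u[:, 0]) * 0.1)             # toy time-dependent target
X = run_reservoir(u)
lam = 1e-3                                       # ridge regularizer
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
print("training MSE:", np.mean((X @ w_out - y) ** 2))
```

Note that only the readout weights w_out are fitted; the recurrent and input weights stay random, which is the defining design choice of reservoir computing.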
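The abstract states that the optimal readout direction need not be trained but can be computed directly from the network statistics. A classical construction with exactly this property is the Fisher linear discriminant, w = Σ⁻¹(μ₊ − μ₋); treating it as a stand-in for the poster's margin-maximizing rule is an assumption. The sketch below computes this direction for synthetic Gaussian reservoir statistics and reports the signal-to-noise ratio along it as a linear-separability measure.

```python
import numpy as np

# Sketch: a binary-classification readout computed from first- and
# second-order statistics alone (Fisher discriminant); identifying it
# with the poster's margin-maximizing rule is an assumption.

rng = np.random.default_rng(1)
N, n_trials = 50, 2000

mu_plus = rng.normal(size=N)                 # class-conditional mean of the
mu_minus = -mu_plus                          # reservoir state (toy values)
A = rng.normal(size=(N, N)) / np.sqrt(N)
Sigma = A @ A.T + 0.5 * np.eye(N)            # shared noise covariance

X_plus = rng.multivariate_normal(mu_plus, Sigma, n_trials)
X_minus = rng.multivariate_normal(mu_minus, Sigma, n_trials)

# Readout direction from the statistics alone: no iterative training.
w = np.linalg.solve(Sigma, mu_plus - mu_minus)

# Signal-to-noise ratio along w quantifies linear separability.
snr = (w @ (mu_plus - mu_minus)) / np.sqrt(w @ Sigma @ w)
acc = np.mean(np.concatenate([X_plus @ w > 0, X_minus @ w < 0]))
print(f"SNR along readout: {snr:.2f}, empirical accuracy: {acc:.3f}")
```

In this reading, the SNR along w plays the role of the separability measure that, per the abstract, can then be used to optimize the rest of the network, e.g. the input projection.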