This title appears in the Scientific Report: 2023
A statistical perspective on learning of time series in neural networks
In this talk, we explore a statistical perspective on learning in neural networks, drawing inspiration from both neuroscience and machine learning. We investigate the stochastic nature of neural activity and stimuli and use tools from statistical physics to address these aspects. The focus lies on the time-dependent processing of stimuli.

Recurrent neural networks, a concept inspired by the brain, handle time series naturally. For weakly non-linear interactions, we develop a method to approximate the network dynamics, leading to improved performance in a random recurrent reservoir. For the scenario of linear interactions, we investigate how the optimal classifier balances stability against performance in the presence of background noise.

We then study how non-linear interactions shape the statistical processing of stimuli, demonstrating a direct relationship between non-linearity, representation, and higher-order statistics using a single-layer perceptron. Moreover, we explore learning the data distribution itself, employing an invertible neural network (a normalizing flow) to extract informative modes. This unsupervised approach uncovers underlying structure, dimensionality, and meaningful latent features in the data.
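The abstract does not specify the reservoir construction, so the following is only a minimal sketch of the general idea of a random recurrent reservoir with weakly non-linear (tanh) units and a trained linear readout. All names, dimensions, and the one-step-ahead sine prediction task are illustrative assumptions, not the setup used in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions: N reservoir units, scalar input, T time steps.
N, T = 100, 500
spectral_radius = 0.9  # keep the random recurrent dynamics stable

# Random recurrent weights, rescaled to the target spectral radius.
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(0.0, 1.0, size=N)

# Drive the reservoir with a sine wave; the toy task is one-step-ahead prediction.
u = np.sin(0.1 * np.arange(T + 1))
states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])  # weakly non-linear update
    states[t] = x

# Only the linear readout is trained, here by ridge regression.
target = u[1 : T + 1]
ridge = 1e-6
w_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ target)
pred = states @ w_out
mse = np.mean((pred[100:] - target[100:]) ** 2)  # discard the initial transient
```

The recurrent weights stay fixed and random; only `w_out` is learned, which is what makes reservoir computing cheap to train.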
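The normalizing-flow idea — learn an invertible map so that the data becomes simply distributed in a latent space, using the change-of-variables formula for the likelihood — can be illustrated in one dimension. The affine map, the synthetic Gaussian data, and all parameter names below are assumptions for illustration only, far simpler than the invertible networks referenced in the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy data: samples from N(2, 0.5^2). The flow is the invertible
# affine map z = (x - b) * exp(-s), pulling the data back to a standard normal.
x = rng.normal(2.0, 0.5, size=2000)
s, b = 0.0, 0.0  # log-scale and shift, learned by maximum likelihood

# Change of variables: log p(x) = log N(z; 0, 1) + log |dz/dx| = -z^2/2 - s + const.
# Analytic gradients of the mean log-likelihood, ascended directly.
lr = 0.05
for _ in range(1000):
    z = (x - b) * np.exp(-s)
    grad_s = np.mean(z**2) - 1.0        # d/ds of mean log-likelihood
    grad_b = np.mean(z) * np.exp(-s)    # d/db of mean log-likelihood
    s += lr * grad_s
    b += lr * grad_b
```

At the optimum, `b` recovers the data mean and `exp(s)` the data scale, i.e. the flow has learned the data distribution; deeper invertible networks generalize this beyond the Gaussian case.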