This title appears in the Scientific Report 2021:
Sequence learning, prediction, and generation in networks of spiking neurons
Personal Name(s): Bouhadjar, Younes (corresponding author); Diesmann, Markus; Wouters, Dirk J.; Tetzlaff, Tom
Contributing Institute: JARA Institut Green IT; PGI-10 Elektronische Materialien; PGI-7 Jara-Institut Brain structure-function relationships; INM-10 Computational and Systems Neuroscience; IAS-6 Computational and Systems Neuroscience; INM-6
Imprint: 2021
Conference: NEST Conference, Online, 2021-06-28 - 2021-06-29
Document Type: Poster
Research Program: Human Brain Project Specific Grant Agreement 3; Advanced Computing Architectures; Computational Principles; Theory, modelling and simulation
Publication portal: JuSER
Sequence learning, prediction, and generation have been proposed to be the universal computation performed by the neocortex. The Hierarchical Temporal Memory (HTM) algorithm realizes this form of computation. It learns sequences in an unsupervised and continuous manner using local learning rules, permits a context-specific prediction of future sequence elements, and generates mismatch signals in case the predictions are not met. While the HTM algorithm accounts for a number of biological features such as topographic receptive fields, nonlinear dendritic processing, and sparse connectivity, it is based on abstract discrete-time neuron and synapse dynamics, as well as on plasticity mechanisms that can only partly be related to known biological mechanisms.

Here, we devise a continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters. The model learns high-order sequences by means of a structural Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. In combination with nonlinear dendritic input integration and local inhibitory feedback, this type of plasticity leads to the dynamic self-organization of narrow, sequence-specific feedforward subnetworks. These subnetworks provide the substrate for a faithful propagation of sparse, synchronous activity and, thereby, for a robust, context-specific prediction of future sequence elements as well as for the autonomous replay of previously learned sequences. By strengthening the link to biology, our implementation facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities. The continuous-time implementation of the TM algorithm permits, in particular, an investigation of the role of sequence timing for sequence learning, prediction, and replay.
We demonstrate this aspect by studying the effect of the sequence speed on the sequence learning performance and on the speed of autonomous sequence replay.
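The abstract describes a Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. The actual model is a continuous-time spiking network (typically simulated in NEST); the toy sketch below is purely illustrative and not the authors' implementation. All names, parameters, and the discrete update scheme are hypothetical: coincident pre/post activity potentiates a synapse (Hebbian term), while a neuron firing above a target rate scales its incoming weights down (homeostatic term).

```python
import numpy as np

# Illustrative sketch only (not the poster's NEST model): Hebbian
# potentiation with a rate-based homeostatic control term.
# All parameter names and values here are hypothetical.

n = 5                       # number of neurons
w = np.zeros((n, n))        # synaptic weights (row: post, col: pre)
rates = np.zeros(n)         # low-pass estimate of firing rates
target_rate = 0.2           # homeostatic target firing rate
eta = 0.1                   # Hebbian learning rate

def step(pre_spikes, post_spikes, w, rates):
    """One plasticity step: Hebbian potentiation for coincident
    pre/post spikes, damped when a neuron fires above its target rate."""
    # Hebbian term: strengthen synapses from active pre to active post.
    dw = eta * np.outer(post_spikes, pre_spikes)
    # Homeostatic term: firing above target shrinks incoming weights.
    homeo = np.clip(1.0 - (rates - target_rate), 0.0, 2.0)[:, None]
    w = np.clip((w + dw) * homeo, 0.0, 1.0)
    # Update the running firing-rate estimate.
    rates = 0.9 * rates + 0.1 * post_spikes
    return w, rates

# Repeatedly present the transition "neuron 0 fires, then neuron 1 fires".
for _ in range(50):
    w, rates = step(pre_spikes=np.eye(n)[0],
                    post_spikes=np.eye(n)[1],
                    w=w, rates=rates)

# The 0 -> 1 synapse is potentiated, but the homeostatic factor keeps
# the weight bounded instead of letting it grow without limit.
```

In this caricature, the homeostatic factor plays the role that rate-based control plays in the described model: it prevents runaway potentiation, so that narrow, sequence-specific pathways can form without individual neurons dominating the activity.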