This title appears in the Scientific Report 2021.
Please use the identifier http://dx.doi.org/10.12751/NNCN.BC2020.0272 in citations.
Personal Name(s): van der Vlag, Michiel (corresponding author); Diaz, Sandra; Woodman, Marmaduke; Fousek, Jan; Peyser, Alexander; Jirsa, Viktor
Contributing Institute: Jülich Supercomputing Centre (JSC)
Imprint: 2020
DOI: 10.12751/NNCN.BC2020.0272
Conference: Bernstein Conference, Online (Germany), 2020-09-29 to 2020-10-01
Document Type: Poster
Research Program: SimLab Neuroscience; Domain-Specific Simulation & Data Life Cycle Labs (SDLs) and Research Groups
With this poster we present RateML, a spin-off of the NeuroML and LEMS domain-specific languages, tailored to generate rate-based models suited for simulators such as The Virtual Brain (TVB), featuring high-performance computing and parameter-sweep capabilities. RateML has been developed to abstract the modelling of a TVB brain model from its implementation and deployment on hardware. Using RateML, code can be produced for different target languages, exploiting the computational capabilities of specific computing paradigms and hardware.

RateML is based on the existing domain-specific language LEMS. The Low Entropy Model Specification (LEMS) is an XML-based language for specifying generic models of hybrid dynamical systems; it extends a sibling language, NeuroML, by providing representations for the variation of cell dynamics in time, i.e. equations for cell dynamics. RateML enables users to generate rate-based brain models from an XML file in which the generic features of TVB models can be addressed without extended knowledge of how to optimally program or simulate such models. Figure 1 shows an example of a Kuramoto model in XML.

A TVB simulation often entails exploring many parameters to fit the simulated dynamics to empirical data, e.g. EEG/MRI data. Such large explorations are best served by a high-performance computing solution. Besides regular (Python) TVB model generation, RateML can generate CUDA code in which designated variables are assigned a range for parameter exploration. The exploration can then be executed with a high degree of parallelization on a GPU. For example, for a brain model with 68 nodes, a single kernel invocation on a V100 GPU can simulate roughly 30,000 parallel instances of the Kuramoto model, exploring the combinations of 173 coupling and 173 speed parameters in seconds.
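The 30,000 figure follows from the sweep dimensions: a 173 x 173 grid of coupling and speed values yields 29,929 parameter combinations, one per GPU thread. A minimal sketch of such a grid in plain NumPy (the value ranges are assumptions for illustration, not the poster's actual settings):

```python
import numpy as np

# Hypothetical sweep ranges; the poster only fixes the counts (173 x 173).
couplings = np.linspace(0.0, 1.0, 173)   # global coupling strength
speeds = np.linspace(1.0, 10.0, 173)     # conduction speed

# Cartesian product: one row per (coupling, speed) combination,
# i.e. one row per parallel simulation instance on the GPU.
grid = np.array(np.meshgrid(couplings, speeds)).T.reshape(-1, 2)
print(len(grid))  # 29929, roughly the 30,000 instances quoted above
```

Each row of `grid` would parameterize one independent simulation instance in the generated CUDA kernel.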
For the Epileptor, a model that is very memory-demanding because it has six state variables, roughly 5,000 parameter combinations can be explored in a single kernel. The models used in these experiments were generated by RateML. Thus, RateML can produce a) Python code compatible with the TVB framework, b) CUDA code that runs directly on GPUs to perform high-performance parameter fitting, and c) in the future, code for Bayesian inversion.
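To illustrate the kind of rate-based dynamics the generated code computes, here is a minimal NumPy sketch of the Kuramoto model mentioned above (hand-written for illustration, not RateML output; the connectome weights and frequencies are random placeholders):

```python
import numpy as np

def kuramoto_step(theta, omega, weights, coupling, dt=0.01):
    """One Euler step of dtheta_i/dt = omega_i + c * sum_j w_ij * sin(theta_j - theta_i)."""
    diff = theta[None, :] - theta[:, None]   # pairwise phase differences theta_j - theta_i
    drift = omega + coupling * (weights * np.sin(diff)).sum(axis=1)
    return theta + dt * drift

rng = np.random.default_rng(0)
n = 68                                   # nodes, as in the poster's example
theta = rng.uniform(0, 2 * np.pi, n)     # initial phases
omega = rng.normal(1.0, 0.1, n)          # natural frequencies
weights = rng.random((n, n))             # placeholder connectome weights

for _ in range(100):
    theta = kuramoto_step(theta, omega, weights, coupling=0.1)
```

In the RateML workflow, the same dynamics would instead be declared in XML and the integration loop emitted automatically as Python or CUDA code.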