This title appears in the Scientific Report 2019.
Please use the identifier http://hdl.handle.net/2128/23087 in citations.
Supercomputers for Deep Learning: An introduction
Personal Name(s): Khalid, Fahad (Corresponding author)
Contributing Institute: Jülich Supercomputing Center (JSC)
Imprint: 2019
Conference: Eighth Annual Retreat of the Institute of Neuroscience and Medicine and the Institute of Complex Systems, Jülich (Germany), 2019-06-25 - 2019-06-26
Document Type: Talk (non-conference)
Research Program: SimLab Neuroscience; Computational Science and Mathematical Methods
Link: OpenAccess
Source: Publikationsportal JuSER
For certain problems, training deep artificial neural networks can require far more compute resources than are typically available on a workstation or laptop. The supercomputers at the JSC provide the required resources for such problems and are regularly used by scientists for efficient training and inference. But how does one get started with training deep learning models on supercomputers? This talk walks the audience through the process of setting up and executing a deep learning project on the JSC supercomputers. We focus on distributed training and inference with the most commonly used frameworks, such as Keras, TensorFlow, and PyTorch. We conclude by sharing our experience from deep learning projects conducted by the SimLab Neuroscience in collaboration with the INM.
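The distributed training mentioned above typically means synchronous data parallelism: each worker computes gradients on its own data shard, the gradients are averaged across workers (the all-reduce step that frameworks such as Horovod or torch.distributed perform over MPI or NCCL), and every worker applies the same update so the model replicas stay in sync. The following is a minimal, framework-free sketch of that idea; the toy model, data, and function names are illustrative assumptions, not part of the talk.

```python
# Illustrative sketch of synchronous data-parallel training.
# Each "worker" holds one data shard; the all-reduce is simulated
# by averaging the workers' gradients in a plain Python loop.

def local_gradient(w, shard):
    """Gradient of mean squared error for the 1-D model y = w * x."""
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def allreduce_mean(grads):
    """Stand-in for the collective all-reduce across workers."""
    return sum(grads) / len(grads)

def train(shards, w=0.0, lr=0.1, steps=50):
    for _ in range(steps):
        # On real hardware these run in parallel, one per node/GPU.
        grads = [local_gradient(w, s) for s in shards]
        # Identical averaged update on every replica keeps them in sync.
        w -= lr * allreduce_mean(grads)
    return w

# Data generated from y = 3x, split across two workers.
shards = [[(1, 3), (2, 6)], [(3, 9), (4, 12)]]
print(round(train(shards), 3))  # converges toward w = 3.0
```

On a cluster, the same structure is usually launched as one process per GPU via the batch system (e.g. `srun` under Slurm), with the framework's collective communication replacing `allreduce_mean`.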