NERSC and NVIDIA will be offering a hybrid Deep Learning at Scale Training from Monday, March 3 to Tuesday, March 4.
The National Energy Research Scientific Computing Center (NERSC) is a U.S. Department of Energy user facility with HPC hardware similar to NSF NCAR's Derecho.
This training will help users explore distributed training of deep learning models on high-performance computing systems (specifically NERSC's Perlmutter). The training will focus on building a large-scale transformer model for weather forecasting, walking users through profiling tools and performance optimization on a single GPU before scaling to multi-GPU nodes.
Hurry! Only virtual registration remains available at this point. The training is intended for a broad audience, from undergraduate students to staff and faculty.
------------------------------
Shira Feldman
UCAR, UCP and NSF NCAR
------------------------------