VI-HPS Tuning Workshop 2021





[Group photo, © LRZ 2021: Participants of the 40th VI-HPS Tuning Workshop]

PRACE Webpage

See https://events.prace-ri.eu/event/1207/ 

VI-HPS Webpage

See https://www.vi-hps.org/training/tws/tw40.html

Slides

  • Day 1
  • Day 2
  • Day 3
  • Day 4
  • Day 5

PRACE Survey

Please fill out the PRACE online survey at

https://events.prace-ri.eu/event/1207/surveys/826

This helps us and PRACE to

  • increase the quality of the courses,
  • design the future training programme at LRZ and in Europe according to your needs and wishes,
  • get future funding for training events.

Information on the Linux Cluster

https://doku.lrz.de/display/PUBLIC/Linux+Cluster

https://doku.lrz.de/display/PUBLIC/IvyMUC

The IvyMUC System

[Photo of the IvyMUC system, © LRZ]


Login to IvyMUC

Login under Windows: use an SSH client (e.g. PuTTY).

Login under Mac: open Terminal and proceed as under Linux.

Login under Linux:

  • Open xterm
  • ssh -Y lxlogin10.lrz.de -l username
  • Use the user name and password provided by LRZ staff (see the login example below)
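
For example, a login and a file transfer for the hypothetical course account hpckurs10 (the actual account name is handed out by LRZ staff; the directory name is a placeholder) could look like this:

# Log in to the IvyMUC login node
ssh -Y lxlogin10.lrz.de -l hpckurs10

# Copy a local source directory to your cluster home (run on your local machine)
scp -r my_sources hpckurs10@lxlogin10.lrz.de:~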

How to use the IvyMUC System

The reservation is only valid during the workshop; for general use of our Linux Cluster, remove the "--reservation=hhps1s21_workshop" option. A complete submit/inspect/cancel cycle is sketched after the list below.

  • Submit a job:
    sbatch --reservation=hhps1s21_workshop job.sh
  • List own jobs:
    squeue -M ivymuc -u hpckurs??
  • Cancel jobs:
    scancel -M ivymuc jobid
  • Interactive Access:
    module load salloc_conf/ivymuc
    salloc --nodes=1 --time=02:00:00 --reservation=hhps1s21_workshop --partition=ivymuc_batch

    or: srun --reservation=hhps1s21_workshop --pty bash

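Putting these commands together, a typical job cycle during the workshop, using the hypothetical account hpckurs10 and a made-up job ID, could look like this:

# Submit the job into the workshop reservation; sbatch prints e.g. "Submitted batch job 123456"
sbatch --reservation=hhps1s21_workshop job.sh
# Check the state of your own jobs
squeue -M ivymuc -u hpckurs10
# Cancel the job if necessary
scancel -M ivymuc 123456
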
Intel Software Stack:

  • The Intel software stack is automatically loaded at login. The Intel compilers are called icc (for C), icpc (for C++) and ifort (for Fortran). They behave similarly to the GNU compiler suite (the -help option shows an option summary). For reasonable optimization including SIMD vectorization, use the options -O3 -xavx (you can use -O2 instead of -O3 and sometimes get better results, since the compiler occasionally tries to be overly smart and undoes many of your hand-coded optimizations).
  • By default, OpenMP directives in your code are ignored. Use the -qopenmp option to activate OpenMP.
  • Use mpiexec -n #tasks to run MPI programs. The compiler wrappers' names follow the usual mpicc, mpifort, mpiCC pattern; compile examples are sketched below.
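
As an illustration of these options, typical compile lines (all file names are placeholders) might be:

# C program, optimized with SIMD vectorization
icc -O3 -xavx -o myprog.exe myprog.c

# Fortran program with OpenMP activated
ifort -O3 -xavx -qopenmp -o myprog.exe myprog.f90

# MPI C program, built with the compiler wrapper
mpicc -O3 -xavx -o mpiprog.exe mpiprog.c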

Example OpenMP Batch File

#!/bin/bash
#SBATCH -o /dss/dsshome1/0D/hpckurs10/ivymuc.%j.%N.out
#SBATCH -D /dss/dsshome1/0D/hpckurs10/
#SBATCH -J ivytest
#SBATCH --clusters=ivymuc
#SBATCH --nodes=1
#SBATCH --get-user-env
#SBATCH --reservation=hhps1s21_workshop
#SBATCH --time=02:00:00
module load slurm_setup

export OMP_NUM_THREADS=16
./myprog.exe

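Assuming the script above is saved as omp_job.sh (the name is arbitrary), it is submitted like any other batch job; the output file pattern follows the -o line of the script:

sbatch omp_job.sh
# stdout/stderr end up in /dss/dsshome1/0D/hpckurs10/ivymuc.<jobid>.<node>.out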

Example MPI Parallel Batch File (2 nodes with 32 MPI tasks per node)


#!/bin/bash
#SBATCH -o /dss/dsshome1/0D/hpckurs10/ivymuc.%j.%N.out
#SBATCH -D /dss/dsshome1/0D/hpckurs10
#SBATCH -J ivytest
#SBATCH --clusters=ivymuc
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=32
#SBATCH --get-user-env
#SBATCH --reservation=hhps1s21_workshop
#SBATCH --time=02:00:00

module load slurm_setup
mpiexec -n $SLURM_NTASKS /dss/dsshome1/0D/hpckurs10/c/mpiprog.exe

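With --nodes=2 and --ntasks-per-node=32, $SLURM_NTASKS evaluates to 64, so mpiexec starts 64 MPI ranks. A matching compile-and-submit sequence (source and script names are placeholders) could be:

# Build the MPI program with the compiler wrapper
mpicc -O3 -xavx -o mpiprog.exe mpiprog.c
# Submit; the script's mpiexec line then launches 2 x 32 = 64 ranks
sbatch mpi_job.sh
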
Further Batch File Examples

See https://doku.lrz.de/display/PUBLIC/Example+parallel+job+scripts+on+the+Linux-Cluster

Reservation

slurm2:~/slurm_reservations # mslurm ivymuc scontrol show reservation
ReservationName=hhps1s21_workshop StartTime=2021-06-14T09:00:00 EndTime=2021-06-18T18:00:00 Duration=4-09:00:00
Nodes=ivynsr0r1is[01-03,05-32] NodeCnt=31 CoreCnt=496 Features=(null) PartitionName=ivymuc_batch Flags=OVERLAP,IGNORE_JOBS  TRES=cpu=992
   Users=hpckurs00,hpckurs01,hpckurs02,hpckurs03,hpckurs04,hpckurs05,hpckurs06,hpckurs07,hpckurs08,hpckurs09,hpckurs10,hpckurs11,hpckurs12,hpckurs13,hpckurs14,hpckurs15,hpckurs16,hpckurs17,hpckurs18,hpckurs19,hpckurs20,hpckurs21,hpckurs22,hpckurs23,hpckurs24,hpckurs25,hpckurs26,hpckurs27,hpckurs28,hpckurs29,hpckurs30,hpckurs31,hpckurs32,hpckurs33,hpckurs34,hpckurs35,hpckurs36,hpckurs37,hpckurs38,hpckurs39,hpckurs40,hpckurs41,hpckurs42,hpckurs43,hpckurs44,hpckurs45,hpckurs46,hpckurs47,hpckurs48,hpckurs49 Accounts=(null) Licenses=(null) State=INACTIVE BurstBuffer=(null) Watts=n/a
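
Participants can check the reservation themselves from a login node; a sketch using the same -M cluster selection as above:

# Show the workshop reservation
scontrol -M ivymuc show reservation hhps1s21_workshop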