Coarray Fortran on LRZ's HPC systems
How to compile and run Coarray Fortran programs on LRZ HPC systems.
Coarrays provide a parallel programming model based on the PGAS (Partitioned Global Address Space) concept and are specified as part of the Fortran standard.
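As a minimal illustration of the model, each image runs a copy of the same program and can read or write coarray data on other images via coindexed accesses. The following sketch (a hypothetical example, not specific to the LRZ installation) sums the image indices on image 1:

program coarray_hello
  implicit none
  integer :: total[*]   ! scalar coarray: each image holds its own copy
  integer :: i

  total = this_image()  ! every image stores its own image index

  sync all              ! make sure all images have written their value

  if (this_image() == 1) then
     ! image 1 accesses the remote copies via coindexing and sums them up
     do i = 2, num_images()
        total = total + total[i]
     end do
     print *, 'running on', num_images(), 'images, sum of indices:', total
  end if
end program coarray_hello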
Availability on LRZ HPC systems
Coarrays are supported by the following compilers on the LRZ HPC systems:
- Intel's Fortran compiler (ifort)
- GCC Fortran compiler (gfortran)
- NAG Fortran compiler (nagfor)
ifort and gfortran support both shared-memory and distributed-memory processing for coarray code. The nagfor compiler provides a shared-memory implementation only.
Setting up the environment
The setup described here applies to the CoolMUC-2 systems as well as SuperMUC-NG. For CoolMUC-3 (lxlogin8 and the KNL nodes), no support is currently offered.
This is done by loading an appropriate stack of environment modules:
- Intel compiler: use the default environment
- GCC gfortran:
  module load gcc/10
  module load opencoarrays/2.7.1-gcc10-impi
- NAG Fortran:
  module unload intel intel-mpi
  module load nag/7.1
Note that it may be necessary to unload other conflicting compiler or MPI modules before the CAF module stack can be loaded. The above is designed to work with the LRZ default environment. Setting up the environment should be done for both compilation and execution.
Compiling and linking
Compiler and mode | Compiler driver | Compilation switch | Linkage switch |
Intel compiler (shared memory) | ifort / mpif90 | -coarray | -coarray |
Intel compiler (distributed memory) | ifort / mpif90 | -coarray=distributed | -coarray=distributed |
GCC gfortran | caf | (none needed) | (none needed) |
NAG Fortran | nagfor | -coarray | -coarray |
The mpif90 driver can be used for Intel Fortran if the coarray code also uses MPI. For GCC gfortran this is not necessary, because the caf driver script already provides the required MPI functionality.
For the NAG compiler, coarrays cannot be used together with OpenMP.
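For illustration, typical build invocations might look as follows (the source and executable names are placeholders):

# Intel Fortran, shared-memory coarrays
ifort -coarray -o my_coarray_program.exe my_coarray_program.f90
# Intel Fortran, distributed-memory coarrays
ifort -coarray=distributed -o my_coarray_program.exe my_coarray_program.f90
# GCC gfortran via the OpenCoarrays caf wrapper
caf -o my_coarray_program.exe my_coarray_program.f90
# NAG Fortran
nagfor -coarray -o my_coarray_program.exe my_coarray_program.f90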
Execution of coarray programs
Intel Fortran on a shared memory system
The following sequence of commands executes the program "my_coarray_program.exe" using 8 images that run on a single computational node.
export FOR_COARRAY_NUM_IMAGES=8
./my_coarray_program.exe
Intel Fortran on a distributed memory system
The following sequence of commands executes the program "my_coarray_program.exe", with the number of images (and, if desired, their distribution across nodes) taken from a configuration file.
export FOR_COARRAY_CONFIG_FILE=./my_coarray_program.conf
./my_coarray_program.exe
The configuration file my_coarray_program.conf needs to contain a line with at least the following entries:
SLURM jobs | LoadLeveler jobs |
-n <# of images> ./my_coarray_program.exe | -n <# of images> ./my_coarray_program.exe |
This information is supplied to the MPI startup mechanism. Other Intel MPI options like -genvall or -ppn (number of tasks per node) can also be added, but this is only necessary for specialized setups. The executable's name and its command line arguments are taken from this configuration file, so care must be taken to keep these consistent.
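For example, a minimal configuration file that starts 8 images could consist of the single line

-n 8 ./my_coarray_program.exe

with options such as -genvall prepended if required.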
GCC gfortran
The Intel MPI mpiexec command can be used to start the coarray executable. Alternatively, the OpenCoarrays cafrun wrapper can be used.
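For example, to start the program on 8 images:

mpiexec -n 8 ./my_coarray_program.exe

or, equivalently,

cafrun -n 8 ./my_coarray_program.exe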
NAG Fortran
An executable built with coarray support can be directly executed after setting
export NAGFORTRAN_NUM_IMAGES=8
(here for 8 images). This variable overrides an existing compiler setting of -num_images=#.
Interoperation of MPI/OpenMP and Coarrays
Intel Fortran and Gfortran
For both compilers, it should be possible to also use MPI calls in coarray programs (with some care to avoid destructive interference between the programming models). There is a one-to-one mapping between MPI ranks and image indices, i.e. each MPI rank corresponds to exactly one coarray image.
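As a sketch of this (hypothetical example code; both coarray runtimes are layered on MPI, so the MPI library is typically already initialized when the program starts), the correspondence between ranks and images can be checked as follows:

program caf_with_mpi
  use mpi
  implicit none
  integer :: rank, nranks, ierr
  logical :: mpi_up

  ! MPI is usually initialized by the coarray runtime; initialize it only if needed.
  call MPI_Initialized(mpi_up, ierr)
  if (.not. mpi_up) call MPI_Init(ierr)

  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nranks, ierr)

  ! With a one-to-one mapping, rank r should correspond to image r+1
  ! (images are numbered from 1, ranks from 0).
  print *, 'image', this_image(), 'of', num_images(), 'is MPI rank', rank, 'of', nranks

  sync all   ! finalization of MPI is left to the coarray runtime in this sketch
end program caf_with_mpi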
It is also expected that OpenMP can be used in coarray programs (as long as no coindexed access happens inside threaded regions), particularly if such use is limited to lower-level (serial) libraries. The usual steps for hybrid execution need to be taken to ensure proper pinning of both tasks/images and threads.
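A corresponding sketch for OpenMP (again hypothetical code, built with the compiler's OpenMP flag in addition to the coarray flag) performs the threaded work on purely image-local data and keeps coindexed accesses outside the parallel region:

program caf_with_openmp
  implicit none
  integer, parameter :: n = 100000
  real :: a(n)
  real :: partial[*]       ! coarray holding each image's local result
  real :: local_sum
  integer :: i

  a = 1.0
  local_sum = 0.0

  ! Threaded loop over image-local data only; no coindexed accesses here.
  !$omp parallel do reduction(+:local_sum)
  do i = 1, n
     local_sum = local_sum + a(i)
  end do
  !$omp end parallel do

  partial = local_sum      ! store the image-local result in the coarray
  sync all

  ! Coindexed accesses happen only here, outside any threaded region.
  if (this_image() == 1) then
     do i = 2, num_images()
        partial = partial + partial[i]
     end do
     print *, 'global sum =', partial
  end if
end program caf_with_openmp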
NAG Fortran
Currently, there is no support for using MPI in addition to coarrays.
Coarray Documentation
- Summary of the coarray feature (as covered in the Fortran 2008 standard) by John Reid.
- Andy Vaught's coarray compendium
- Intel Fortran compiler documentation, available from the Intel web site