...

> salloc --nodes=1 --time=00:30:00 --partition=cm2_inter

For requesting one node on the test partition of SuperMUC-NG for half an hour, please issue:
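
A rough sketch of such a request, assuming the SuperMUC-NG test partition is literally named "test" and that the project budget must be passed via --account (both assumptions, not taken from this page), could look like:

# Sketch only: the partition name "test" and the --account requirement
# are assumptions, not taken from this page.
> salloc --nodes=1 --time=00:30:00 --partition=test --account=<project_id>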

...

For production runs, please create a SLURM batch script and submit it via "sbatch".
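
As an illustration, with a hypothetical script file name (pymol_job.slurm is a placeholder, not from this page), submission and a quick look at the queue could be done like this:

# Submit the batch script (file name is a placeholder) and check its status.
sbatch pymol_job.slurm
squeue -u $USER          # on the Linux Cluster, add e.g. --clusters=cm2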

Linux Cluster

#!/bin/bash
#SBATCH -o /dss/dsshome1/<group>/<user>/mydir/%x.%j.out
#SBATCH -e /dss/dsshome1/<group>/<user>/mydir/%x.%j.err
#SBATCH -D /dss/dsshome1/<group>/<user>/mydir
#SBATCH -J <job_name>
#SBATCH --time=24:00:00
#SBATCH --clusters=cm2
#SBATCH --partition=cm2_tiny
#SBATCH --qos=cm2_tiny
#SBATCH --nodes=1
#SBATCH --export=NONE
#SBATCH --get-user-env
#SBATCH --mail-type=end
#SBATCH --mail-user=<email_address>@<domain>

source /etc/profile.d/modules.sh
module load slurm_setup
module load pymol

pymol -c PML_FILE_NAME.pml

SuperMUC-NG

#!/bin/bash
#SBATCH -o /dss/dsshome1/<group>/<user>/mydir/%x.%j.out
#SBATCH -e /dss/dsshome1/<group>/<user>/mydir/%x.%j.err
#SBATCH -D /dss/dsshome1/<group>/<user>/mydir
#SBATCH -J <job_name>
#SBATCH --time=24:00:00
#SBATCH --partition=micro
#SBATCH --nodes=1
#SBATCH --export=NONE
#SBATCH --get-user-env
#SBATCH --mail-type=END
#SBATCH --mail-user=<email_address>@<domain>
#SBATCH --account=<project_id>

module load slurm_setup
module load pymol

pymol -c PML_FILE_NAME.pml
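
Both scripts run PyMOL in headless mode ("-c") on a PyMOL command script. As a minimal sketch of what such a script might contain (the structure and image file names are placeholders, not taken from this page), PML_FILE_NAME.pml could be created like this:

# Sketch only: input and output file names are placeholders.
cat > PML_FILE_NAME.pml << 'EOF'
# load coordinates, show a cartoon representation, ray-trace and save a PNG
load input_structure.pdb
hide everything
show cartoon
ray 1600, 1200
png rendered_structure.png
quit
EOF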

...