...
mpiexec -np 1 pw.x -i inputfilename     # or bands.x
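The input file handed to pw.x is a plain-text file in the Quantum ESPRESSO namelist/card format. The following minimal SCF input for bulk silicon is only an illustrative sketch: the prefix, the pseudopotential file name (Si.pz-vbc.UPF), the pseudo_dir location and the cutoff are assumptions and must be adapted to your own calculation.

&control
    calculation = 'scf'
    prefix      = 'si'
    outdir      = './tmp'
    pseudo_dir  = './pseudo'     ! assumed location of the pseudopotential file
/
&system
    ibrav = 2, celldm(1) = 10.2,
    nat = 2, ntyp = 1,
    ecutwfc = 30.0
/
&electrons
    conv_thr = 1.0d-8
/
ATOMIC_SPECIES
  Si  28.086  Si.pz-vbc.UPF
ATOMIC_POSITIONS (alat)
  Si 0.00 0.00 0.00
  Si 0.25 0.25 0.25
K_POINTS (automatic)
  4 4 4  0 0 0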
Running parallel Quantum ESPRESSO with a batch script on the Linux-Cluster and SuperMUC-NG

Linux-Cluster
#!/bin/bash
#SBATCH -o /home/cluster/group/user/job_name/%x.%j.%N.out
#SBATCH -D ./
#SBATCH -J <job_name>
#SBATCH --get-user-env
#SBATCH --clusters=cm2_tiny
#SBATCH --partition=cm2_tiny
#SBATCH --nodes=1-1
#SBATCH --ntasks=32
#SBATCH --cpus-per-task=28
#SBATCH --mail-type=end
#SBATCH --mail-user=<email_address>@<domain>
#SBATCH --export=NONE
#SBATCH --time=24:00:00
module load slurm_setup
cd mydir                            # <link to your input>
module load quantum-espresso
mpiexec -n 8 pw.x -i my-job.inp     # or bands.x
# or
srun -n 8 -i my-job.inp pw.x

SuperMUC-NG

#!/bin/bash
# Job Name and Files (also --job-name)
#SBATCH -J jobname
# Output and error (also --output, --error):
#SBATCH -o ./%x.%j.out
#SBATCH -e ./%x.%j.err
# Initial working directory (also --chdir):
#SBATCH -D ./
# Notification and type
#SBATCH --mail-type=END
#SBATCH --mail-user=insert_your_email_here
# Wall clock limit:
#SBATCH --time=24:00:00
#SBATCH --no-requeue
# Setup of execution environment
#SBATCH --export=NONE
#SBATCH --get-user-env
#SBATCH --account=insert_your_projectID_here
#SBATCH --partition=insert test, micro, general, large or fat

module load slurm_setup
cd mydir                            # <link to your input>
module load quantum-espresso

mpiexec -n 8 pw.x -i my-job.inp     # or bands.x
# or
srun -n 8 -i my-job.inp pw.x
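To queue one of the scripts above, save it to a file and hand it to SLURM with sbatch. The file name job.sh below is only an assumed example; on the Linux-Cluster the --clusters option must match the cluster chosen in the script (cm2_tiny is assumed here), while on SuperMUC-NG no cluster option is needed.

sbatch job.sh                                  # submit the batch script
squeue --clusters=cm2_tiny -u $USER            # check its state on the Linux-Cluster
squeue -u $USER                                # check its state on SuperMUC-NG
scancel --clusters=cm2_tiny <job_id>           # cancel the job on the Linux-Cluster if necessary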
Documentation
...
ls -C $QESPRESSO_DOC
ls -C $QESPRESSO_BASE
- Other formats of the documentation and more information can be found on the Quantum ESPRESSO webpage.
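To see which Quantum ESPRESSO versions are installed and which paths the environment variables above point to, the module system itself can be queried. This is a generic environment-module workflow, not something specific to the Quantum ESPRESSO package:

module avail quantum-espresso        # list the installed Quantum ESPRESSO modules
module load quantum-espresso         # load the default version
module show quantum-espresso         # display the variables the module sets (e.g. QESPRESSO_DOC, QESPRESSO_BASE)
ls -C $QESPRESSO_DOC                 # list the documentation files shipped with the installation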
Examples
After loading the appropriate environment module, list the Quantum ESPRESSO examples:
ls -C $QESPRESSO_EXAMPLES
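The example directories are read-only in the installation tree, so a common pattern is to copy one of them into your own (scratch) directory before working with it. The directory name example01 and the use of $SCRATCH are assumptions; adapt them to the names actually listed by the command above.

module load quantum-espresso
cp -r $QESPRESSO_EXAMPLES/example01 $SCRATCH/      # example01 is an assumed directory name
cd $SCRATCH/example01
# adapt one of the batch scripts above to the input files found here and submit it with sbatch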
Support
...