High Performance Computing

Forgot your Password? click here
Add new user (only for SuperMUC-NG)? click here
Add new IP (only for SuperMUC-NG)? click here
How to write good LRZ Service Requests? click here
How to setup two-factor authentication (2FA) on HPC systems? click here
New: Virtual HPC Lounge to ask questions and get advice. Every Wednesday, 2:00pm - 3:00pm
System Status (see also: Access and Overview of HPC Systems)
GREEN = fully operational | YELLOW = operational with restrictions (see messages below) | RED = not available (see messages below)
| SuperMUC-NG | Status |
|---|---|
| login nodes: skx.supermuc.lrz.de (Phase 1) | UP |
| Partitions/Queues (Phase 1) | |
| login nodes: pvc.supermuc.lrz.de (Phase 2) | UP |
| Partitions/Queues (Phase 2) | |
| File systems: HOME, WORK, SCRATCH, DSS, DSA; DAOS (SuperMUC-NG Phase 2 only) | |
| Further documentation | |

Submit an Incident Ticket for SuperMUC-NG
Add new user? click here
Add new IP? click here
Questions about 2FA on SuperMUC-NG? click here
| CoolMUC-4 | Status |
|---|---|
| login nodes: cool.hpc.lrz.de | UP |
| serial partition serial_std | MOSTLY UP |
| serial partition serial_long | UP |
| parallel partitions cm4_tiny / cm4_std | UP |
| interactive partition: cm4_inter | UP |
| teramem_inter | UP |
| Housing Clusters (Access restricted to owners/users) | Status |
|---|---|
| kcs | PARTIALLY UP |
| biohpc | MOSTLY UP |
| hpda | UP |
| File Systems | Status |
|---|---|
| HOME | UP |
| Compute Cloud and other HPC Systems | Status |
|---|---|
| Compute Cloud (https://cc.lrz.de), detailed status: Status | UP |
| AI Systems | UP |
| DSS Storage systems |
|---|
| For the status overview of the Data Science Storage, please go to https://doku.lrz.de/display/PUBLIC/Data+Science+Storage+Statuspage |
Messages
see also: Aktuelle LRZ-Informationen / News from LRZ
Messages for all HPC Systems

The new ANSYS Release 2025.R2 has been installed, tested and rolled out on SuperMUC-NG Phase 1 and CoolMUC-4. ANSYS 2025.R2 is now the default ANSYS release on those systems, and the LRZ documentation has been updated accordingly.

The new Siemens PLM release of StarCCM+ 2025.2.1 (= 2506.0001 = v20.04.008) has been installed, tested and rolled out on SuperMUC-NG Phase 1 and CoolMUC-4. StarCCM+ 2025.2.1 is now the default StarCCM+ release on those systems, and the LRZ documentation has been updated accordingly. At the same time, the old StarCCM+ versions 2023.x.1 (x = 1, 2, 3) have been deprecated and removed from the LRZ HPC systems.
Messages for SuperMUC-NG

Phase 2: Start of user operation has been announced. All SuperMUC-NG Phase 1 projects can now access Phase 2 as well and have an initial budget of 900 GPU hours for testing. In the startup phase of user operation, we will only bill 25% of the actual compute time used. This factor will be gradually increased to 100% over the next few months; changes will be announced in a timely fashion. Regular compute time applications requesting resources on Phase 2 can now be submitted via the GCS-JARDS portal https://jards.gauss-centre.eu/gcshome/ and a GCS Large Scale Call is coming up in July.
Messages for Linux Clusters

New quick reference: the first release of the CoolMUC Cheat Sheet has been published.
Messages for Compute Cloud and other HPC Systems

The AI Systems (including the BayernKI and MCML system segments) will undergo a maintenance procedure between September 8th and 10th, 2025. On these days, the system will not be available to users. Normal user operation is expected to resume during the course of Wednesday, September 10th.

The LRZ AI Systems have to undergo a short maintenance early next week. For this, the system will be drained over the weekend and the start-up of new jobs will be delayed until after the maintenance. We expect the actual downtime not to exceed 10 minutes.

The AI Systems (including the BayernKI and MCML system segments) will undergo a maintenance procedure between May 19th and 21st, 2025. On these days, the system will not be available to users. Normal user operation is expected to resume during the course of Wednesday, May 21st.
HPC Services
Attended Cloud Housing
More Links