Forgot your Password? click here
Add new user (only for SuperMUC-NG)? click here
Add new IP (only for SuperMUC-NG)? click here
How to write good LRZ Service Requests? click here
System Status (see also: Access and Overview of HPC Systems)

Status colour legend:
- Green = fully operational
- Yellow = operational with restrictions (see messages below)
- Red = not available (maintenance Feb 1-5 on SuperMUC and Feb 3-5 on the Linux Cluster)
Supercomputer (SuperMUC-NG)
- System
- Login nodes: skx.supermuc.lrz.de (see the connection and job sketch below)
- Archive nodes: skx-arch.supermuc.lrz.de
- File systems
- Partitions/queues: fat, test
- Globus Online file transfer
- Detailed node status
- Details:
- Submit an Incident Ticket for the SuperMUC-NG. Add new user? click here. Add new IP? click here.
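For orientation, the sketch below shows one way to reach the SuperMUC-NG login nodes listed above and to submit a small batch job to the test partition. It is a minimal sketch assuming standard ssh and Slurm usage, not official LRZ guidance: the user name, project/account ID, and resource limits are placeholders, and site-specific requirements (registered IP addresses, mandatory #SBATCH options) are not covered here.

```bash
# Log in to a SuperMUC-NG login node (user name is a placeholder).
ssh your_lrz_userid@skx.supermuc.lrz.de

# Minimal Slurm job script for the "test" partition listed above.
cat > test_job.sbatch <<'EOF'
#!/bin/bash
#SBATCH --job-name=smoke-test
#SBATCH --partition=test        # partition from the status list above
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --time=00:10:00
#SBATCH --account=your_project  # placeholder project/account ID
srun hostname                   # trivial test command
EOF

sbatch test_job.sbatch          # submit the job
squeue -u "$USER"               # check its state in the queue
```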
Linux Cluster

CoolMUC-2
- Login nodes: lxlogin(1,2,3,4).lrz.de (see the job sketch below)
- Serial partition: serial
- Parallel partitions: cm2_(std,large)
- Cluster: cm2_tiny
- Interactive partition: cm2_inter
- c2pap

CoolMUC-3
- Login nodes: lxlogin(8,9).lrz.de
- Parallel partition: mpp3_batch
- Interactive partition: mpp3_inter

teramem, ivymuc, kcs
- Login node: lxlogin10.lrz.de
- ivymuc
- teramem_inter
- kcs

File systems
- HOME
- HOME, DSS SCRATCH

Details: click here
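Analogously, a minimal sketch for the CoolMUC-2 segment, again assuming standard ssh and Slurm usage. The user name and resource values are placeholders, and how cm2_tiny and cm2_inter are addressed (via --clusters versus --partition) is an assumption here, not a statement of the actual LRZ configuration.

```bash
# Log in to one of the CoolMUC-2 login nodes listed above (user name is a placeholder).
ssh your_lrz_userid@lxlogin1.lrz.de

# Minimal Slurm job script for the cm2_tiny segment.
cat > tiny_job.sbatch <<'EOF'
#!/bin/bash
#SBATCH --job-name=tiny-test
#SBATCH --clusters=cm2_tiny     # "cluster cm2_tiny" from the list above (assumed flag usage)
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --time=00:10:00
srun hostname                   # trivial test command
EOF

sbatch tiny_job.sbatch          # submit the job
# Interactive work would use the interactive partition, e.g. (assumed invocation):
# salloc --partition=cm2_inter --nodes=1 --time=00:30:00
```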
Compute Cloud and other HPC Systems
- Compute Cloud (https://cc.lrz.de): detailed status and free slots: https://cc.lrz.de/lrz
- LRZ AI Systems: Status
- RStudio Server: Status

Details:
- The Compute Cloud will be shut down due to a planned power outage on Tuesday afternoon, February 2nd. Operation is expected to resume on Friday, February 5th, 2021. For further details, please see here. This downtime will also affect RStudio Server.

Messages for SuperMUC-NG
Due to the job processing backlog and the current job profile, no reservations or prioritizations can be granted.
Maintenance has been scheduled for February 1-5. See details.
Please see the upcoming Online Courses and Workshops provided by LRZ.
The new version 2021 of Abaqus (FEM software by Dassault Systèmes) has been installed on the Linux Cluster segments running the SLES15 operating system (i.e., CM2 and CM2_tiny). New corresponding module files are provided, and abaqus/2021 has been made the new default Abaqus module.
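For orientation, a minimal sketch of how the new default module would typically be picked up with standard environment-modules commands. Only the module name abaqus/2021 is taken from the announcement above; availability on a given node, licensing setup, and the exact release-query command are assumptions.

```bash
module avail abaqus            # list installed Abaqus versions
module load abaqus             # loads the default module, now abaqus/2021
# module load abaqus/2021      # or pin the version explicitly instead
abaqus information=release     # typical Abaqus release query; exact invocation may differ
```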
Due to the very limited use of MSC products and MSC licenses on the high-performance systems at LRZ, MSC software support, including installations and the license server, will be discontinued by September 2021. If you want to discuss this matter, please submit a ticket with the subject "MSC" via https://servicedesk.lrz.de/.
The announced maintenance interval has been shifted to February 3-5. Please read the details on the impact for the different cluster segments.
Archive nodes update
The new ANSYS software release, version 2022.R1, has been installed and provided on SuperMUC-NG. For details please refer to the corresponding announcement:
The Energy Aware Runtime (EAR) has been reactivated. Please be aware that this may have an impact on job processing times.
Please note that WORK/SCRATCH on SuperMUC-NG may currently exhibit performance degradation under heavy I/O load. Take this into account when planning your job runtimes.
The new hpcreport tool is now available to check job performance and accounting on SuperMUC-NG. Please check it out.
Messages for Linux Clusters
SCRATCH is now fully online again. While we expect older data that were temporarily inaccessible to be fully available again, data created in the last few days before the problems started might be corrupt and need to be restored from the tape archive (if one exists) or recreated.
The new ANSYS software release, version 2022.R1, has been installed and provided on the LRZ Linux Cluster systems (CM2, CM3 and RVS systems). For details please refer to the corresponding announcement:
The new release of Abaqus, version 2022 (Dassault Systèmes software), has been installed on both Linux Clusters CoolMUC-2 / CoolMUC-3 as well as on the RVS systems. The Abaqus documentation has been updated.
The new release of SimCenter StarCCM+, version 2021.3.1 (Siemens PLM software), has been installed and provided on the LRZ HPC systems (CM2, CM3, SNG and RVS systems). For details please see the corresponding announcement:
There are 4 new Remote Visualization (RVS_2021) nodes available in a friendly-user testing period. The nodes are operated under Ubuntu OS with NoMachine. For more details please refer to the documentation.
Messages for Cloud and other HPC Systems
The LRZ AI and MCML systems are back in operation, as the maintenance procedure planned from January 7th to January 11th is completed. The RStudio Server service at LRZ was decommissioned. For a replacement offering, please see Interactive Web Servers on the LRZ AI Systems and, more generally, LRZ AI Systems.