Container Technology and AI: Material

Overview

Since the introduction of Docker in 2013, container technology has become the industry standard for packaging, distributing, and deploying software.

Creating a container means bundling an application together with its runtime, dependencies, libraries, and settings into a single unit that can then run independently of the underlying infrastructure. Unlike virtual machines, containers are lightweight and deliver higher performance while offering greater versatility and interoperability. Because containers provide an easy, safe, reliable, and scalable way to run applications and pipelines, they are an attractive choice for HPC and AI workloads.
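As a simple illustration, a container image is usually described by a short recipe such as a Dockerfile. The sketch below is a minimal, hypothetical example of bundling a small Python application with its dependencies; the base image, file names, and dependency list are placeholders and not part of the course material:

    # Minimal, hypothetical Dockerfile: bundle a Python application with its dependencies
    FROM python:3.11-slim

    # Install the application's library dependencies (placeholder requirements file)
    COPY requirements.txt /app/requirements.txt
    RUN pip install --no-cache-dir -r /app/requirements.txt

    # Add the application code and define how the container starts
    COPY app.py /app/app.py
    WORKDIR /app
    CMD ["python", "app.py"]

Once built (for example with docker build -t myapp .), the resulting image runs the same way on a laptop, an HPC node, or a cloud instance (docker run myapp), which is what makes containers independent of the underlying infrastructure.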

In this module, we will showcase the most compelling features offered by containers. We will explore their history and implementations, and then dive into practical and advanced uses, with a particular emphasis on AI tasks, reproducible biomedical pipelines, and automated workflows.

Participants will roll up their sleeves and get hands-on experience with the LRZ Compute Cloud, putting containers into action. By the end of the course, participants will be able to transfer this knowledge and experience to their own use cases and requirements.

Available Material

Content                    Link
Handout                    Handout.pdf (1.2 MB)
Presentation               Presentation (9.7 MB)
Material for exercises     Exercices.zip (3.3 MB)
Additional base images     mignon, gioconda, lrz
Additional wav files

Contact person

Questions and comments can be addressed to Florent Dufour: florent@lrz.de