22–26 Jan 2018
Bormio, Italy
Europe/Berlin timezone

The latest developments in preparations of the LHC community for the computing challenges of the High Luminosity LHC

22 Jan 2018, 17:42
3m
Bormio, Italy

Poster | Applications and Instrumentations | Monday Afternoon

Speaker

Dr Dagmar Adamova (Nuclear Physics Institute AS CR)

Description

After the LHC community successfully completed Run 1, the capacity of the Worldwide LHC Computing Grid (WLCG) became the limiting factor in processing the ever-growing volumes of data produced by LHC collisions. Over the last five years the LHC community has launched a number of activities to increase computing performance and optimize the usage of available resources. These activities are particularly important in preparation for Run 3 (2021-2023) and Run 4 (2026-2029), the era of the High Luminosity LHC, because mainstream technology evolution might fall short of WLCG needs by up to a factor of 10. These endeavours must constantly adapt to new technologies; they concentrate on redesigning the computing models of the LHC experiments, improving the efficiency of the data processing chains, adapting software to fast and/or cheap CPU architectures, and increasing the use of diverse resources such as private and public clouds or high performance computing (HPC) facilities.

WLCG was built at a time when there was no experience with, or example of, such an infrastructure in industry or elsewhere. This situation changed during Run 1, when the global internet and computing industry began to provide on-demand services and developed tools and solutions that are of interest to WLCG as well. The latest strategies to increase WLCG performance are therefore also concerned with the utilization of tools provided from outside the LHC community: the use of commercial cloud services for LHC data simulation and processing; the formation and exploitation of data lakes; and the use of popular data mining and analytics pipelines.

The HEP Software Foundation was formed to orchestrate the process of software transformation toward higher efficiency and adaptation to new computing technologies. The need for essential changes and external expertise has been recognized: for example, the use of Python-based notebooks for high-level interactive analysis, or the integration of machine learning in reconstruction. There are opportunities for more commonality across experiments, and there is increasing collaboration with other big-data projects such as the Square Kilometre Array (SKA). Important steps forward are also expected from the continued collaboration with the IT industry through CERN openlab.

In this contribution we will present the latest innovative strategies of the LHC community to prepare for the computing challenges of the next ten years, and especially for the demands of the High Luminosity LHC.
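As a concrete illustration of the Python-notebook workflow mentioned in the abstract above, the minimal sketch below reads a single branch from a ROOT file and histograms it interactively, the kind of high-level analysis step a notebook cell would contain. It is not taken from the contribution itself: the file name events.root, the tree name Events, and the flat branch Muon_pt are hypothetical placeholders, and the example assumes the open-source uproot library (version 4 or later) together with NumPy and matplotlib.

    import uproot                     # reads ROOT files in pure Python
    import matplotlib.pyplot as plt

    # Open the (hypothetical) input file and locate the event tree.
    events = uproot.open("events.root")["Events"]

    # Load one flat branch as a NumPy array; jagged branches would
    # instead need an awkward-array representation (library="ak").
    pt = events["Muon_pt"].array(library="np")

    # Typical interactive notebook step: select, histogram, inspect.
    selected = pt[pt > 20.0]          # keep muons above 20 GeV
    plt.hist(selected, bins=50)
    plt.xlabel("muon pT [GeV]")
    plt.title("Muons with pT > 20 GeV")
    plt.show()

In a notebook, each of these steps would live in its own cell, so the selection and plotting can be re-run and refined without re-reading the file.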

Primary authors

Dr Dagmar Adamova (Nuclear Physics Institute AS CR)
Dr Maarten Litmaath (CERN)

Presentation materials