Physicists at the University of Warwick are among scientists developing vital software to exploit the large data sets collected by the next-generation experiments in high energy physics (HEP), predominantly those at the Large Hadron Collider (LHC).
Over the years, the existing code has struggled to keep pace with the rising data output of large-scale experiments.
The new and optimised software will have the capability to crunch the masses of data that the LHC at CERN and next-generation neutrino experiments such as DUNE and Hyper-Kamiokande will produce this decade.
This is the first time a team of UK researchers has been funded by the Science and Technology Facilities Council (STFC) to develop a software-based project.
The University of Warwick is involved in three of the five work packages for the project, which will also involve collaboration with Monash University as part of the Monash-Warwick Alliance.
Dr Ben Morgan from the University of Warwick Department of Physics, who is principal investigator for the Warwick contribution, said: "Software and computing are critical tools for scientific research, given the ever-increasing amount of data that experiments are recording - especially so in high energy physics, where we seek to maximise the measurements we can make and the discoveries these may lead to. Our work at the University of Warwick, in cooperation with Monash University, CERN and other stakeholders, on reducing the CPU time required to simulate particle collisions and how our detectors record them will address one of the most complex and computationally expensive areas for HEP.
"Increasing the efficiency of software used in these areas will allow us to maximise the science we can do whilst reducing our resource footprint, for HEP as well as fields and industries that utilise these software packages for their research such as medical and space physics."
Professor Timothy Gershon, of the University of Warwick Department of Physics, said: "We will be working to reduce the amount of CPU time needed to produce simulations of the physics processes occurring in LHC collisions - as the rate at which data are collected increases, this is becoming a major bottleneck in our ability to complete measurements. The link between experts in the relevant software packages at Monash and Warwick will be crucial to allow us to 'accelerate' this aspect of LHC computing."
Professor Davide Costanzo, the Principal Investigator (PI) based at the University of Sheffield, said:
"Modern particle physics experiments are capable of producing an exabyte of real and simulated data every year. Modern software tools are needed to process the data and produce the physics results and discoveries that we, particle physicists, are so proud of. This is central to the exploitation of particle physics experiments in the decades to come."
If scientists used everyday computers to store one exabyte of data, they would need almost one million powerful home computers to do it.
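As a rough back-of-the-envelope illustration of that figure (assuming, purely for the sake of the example, that each home computer offers about one terabyte of storage), the arithmetic works out as follows:

```python
# Back-of-the-envelope estimate: how many home computers would it take
# to store one exabyte of data?
# Illustrative assumption: each computer has roughly a 1 TB drive.

EXABYTE_BYTES = 10**18            # 1 EB in bytes (decimal/SI units)
TERABYTE_BYTES = 10**12           # 1 TB in bytes
STORAGE_PER_COMPUTER_TB = 1       # assumed storage per home computer

computers_needed = EXABYTE_BYTES / (STORAGE_PER_COMPUTER_TB * TERABYTE_BYTES)
print(f"Computers needed: {computers_needed:,.0f}")  # -> 1,000,000
```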
Without a software upgrade, the computing resources needed for the LHC would be expected to grow sixfold over the next decade. This would not only be prohibitively expensive in hardware costs and software inefficiencies, but would also mean increased electricity consumption. The more efficient software the team will develop will help reduce the usage of computing resources and the carbon footprint of data centres across the world.
The project aims to ensure scientists can exploit the physics capabilities of future HEP experiments while keeping computing costs affordable.
Creating this novel solution will also develop skills transferable to the wider UK economy.
Conor Fitzpatrick, the Deputy PI and UKRI Future Leaders Fellow based at the University of Manchester, said:
"The data rates we expect from the LHC upgrades and future experimental infrastructure represent an enormous challenge: if we are to maximally exploit these state-of-the art machines, we need to develop new and cost-effective ways to collect and process the data they will generate.
"This challenge is by no means unique to scientific infrastructure. Similar issues are being faced by industry and society as we become increasingly reliant on real-time analysis of big data in our everyday lives. It is very encouraging that STFC is supporting this effort, which brings together and enhances the significant software expertise present in the UK to address these challenges."
The upgrade will be organised into a set of activities in which the UK has leading expertise and experience, involving work on data management, event generation, simulation, reconstruction and data analysis.
The partners involved in the upgrade include UK universities and STFC's Rutherford Appleton Laboratory (RAL), working in cooperation with CERN and other international stakeholders. Staff at RAL are involved in the data management and reconstruction work packages, developing tracking tools for the LHC experiments.
The team aim to engage the community in the development of the software needed by particle physics experiments, with workshops planned twice every year.
The project's key milestone for 2024 is to evolve from proof-of-concept studies to deployment, ready for the start of data-taking at the HL-LHC and next-generation neutrino experiments, which are expected to dominate the particle physics scene in the second half of the decade.