The research project is (co-)funded by the Slovenian Research Agency.
UL Member: Faculty of Mathematics and Physics
Code: J1-60028
Project: Development of machine learning methods for precise predictions of background processes in new physics searches at the Large Hadron Collider (LHC)
Period: 1. 1. 2025 - 31. 12. 2027
Range per year: 1.03 FTE, category: D
Head: Borut Paul Kerševan
Research activity: Natural sciences and mathematics
Project description:
The success of the (HL-)LHC physics program depends on the ability of the LHC experiments to process unprecedentedly complex data collected at prodigious rates, to produce matching volumes of simulated data, and to continue to deliver timely, high-precision measurements and searches. In the coming years, especially after the upgrade of the LHC to the HL-LHC with its substantially larger data volumes, the demands on measurement precision will increase substantially. The analysis power of new physics searches, in which new physics ‘signals’ are sought as an excess of data over the known ‘background’ processes, depends critically on our ability to predict the backgrounds accurately, both in terms of rate (yield, normalization) and in the distributions of the kinematic observables of interest. Identifying rare phenomena in the complex events of high-energy collisions at the LHC upgrades thus becomes progressively more difficult and requires excellent control of systematic effects. This can be achieved only with highly performant selection algorithms that reduce background contamination, combined with very precise predictions, including uncertainty estimates, of the remaining background processes.

In the “noisy” environment of the (HL-)LHC, this is a considerable challenge. The optimal state-of-the-art methodology for this task is based on Machine Learning (ML) techniques. Their development is, however, driven by commercial applications, and the requirements for what is ‘good enough’ are very different in particle physics. Furthermore, the precise estimation of uncertainties, essential for applications in particle physics measurements, is generally absent in commercial applications.
The project will thus focus on designing a new generation of scientific computing tools that overhaul the algorithms used in high-energy physics (HEP) towards a more generalized use of Machine Learning (ML), in particular Deep Learning (DL), and will apply these to the precise determination of backgrounds in physics analyses at the ATLAS experiment at the LHC.
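To make the background-estimation idea above concrete, the following is a minimal toy sketch, not the project's actual method: a simple logistic-regression "selection algorithm" separates an invented signal from an invented background using a single kinematic observable, and a bootstrap attaches an uncertainty to the yield of background events that survive the selection. All distributions, event counts, and the score threshold are illustrative assumptions.

```python
# Toy illustration of ML-based event selection with a bootstrap uncertainty
# on the residual background yield. All numbers here are invented and bear
# no relation to any real ATLAS analysis.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical kinematic observable: the background falls steeply, while
# the signal peaks at a higher value.
n_bkg, n_sig = 5000, 500
x_bkg = rng.exponential(scale=50.0, size=n_bkg)
x_sig = rng.normal(loc=120.0, scale=10.0, size=n_sig)

x = np.concatenate([x_bkg, x_sig])
y = np.concatenate([np.zeros(n_bkg), np.ones(n_sig)])  # 0 = background, 1 = signal

# Standardize the feature and fit logistic regression by gradient descent
# on the log-loss (a stand-in for a more capable deep-learning classifier).
mu, sd = x.mean(), x.std()
z = (x - mu) / sd
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * z + b)))  # sigmoid score
    w -= 0.5 * np.mean((p - y) * z)         # log-loss gradient w.r.t. w
    b -= 0.5 * np.mean(p - y)               # log-loss gradient w.r.t. b

# Apply a score cut and count the background events that survive it.
scores = 1.0 / (1.0 + np.exp(-(w * z + b)))
selected = scores > 0.5
n_bkg_pass = int(np.sum(selected & (y == 0)))

# Bootstrap-resample the background events to attach a statistical
# uncertainty to the surviving-background yield.
idx = np.flatnonzero(y == 0)
boot = [np.sum(scores[rng.choice(idx, size=idx.size)] > 0.5) for _ in range(200)]
print(f"background passing selection: {n_bkg_pass} +/- {np.std(boot):.1f}")
```

In a real HL-LHC analysis the classifier would use many correlated observables and a deep network, and the uncertainty budget would also cover systematic effects (detector modelling, theory predictions), which this statistical-only bootstrap does not attempt to capture.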