Particle Physics

Particle physics experiments generate vast amounts of data. To analyse it, the group utilises a range of methods: trigger algorithms (developed for the real-time analysis of data, to decide which events to store for future analysis), the analysis of large datasets on GridPP, and state-of-the-art machine learning algorithms.

Particle Physics Research Group

Data Science plays a central role in the Manchester Particle Physics Group's work on a number of international experiments that search for new fundamental particles and new types of particle interactions.

The Large Hadron Collider (LHC) at CERN provides approximately one billion proton-proton collisions every second, and this huge dataset presents a number of significant data-intensive challenges in the search for signatures of new physics.

First, even a single detector readout produces around 1 MB of data, which severely limits the rate at which events can be stored to disk. The data volume at each experiment is reduced to approximately 1 GB/s using real-time fast-analysis algorithms that are collectively referred to as the trigger. The advent of the High Luminosity LHC in ~2025 will see the initial data rate increase by a factor of 10. The efficient design of the trigger algorithms is critical, as events that are not selected by the trigger are lost forever.

Second, the analysis of the selected data requires dedicated computing frameworks. The LHC data are propagated to the Worldwide LHC Computing Grid (WLCG), a distributed computing framework that allows the data to be accessed by any member of an experiment at any time. This 'high throughput' model requires an ever-increasing amount of CPU and disk resources.

Finally, the analysis of the stored data involves searching through tens of millions of events for a handful of interest. This analysis is often carried out using multivariate techniques in order to distinguish the event topologies of interest more efficiently.
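The sketch below works through the rate reduction implied by these numbers and adds a toy multivariate selection. It is a minimal Python illustration only: the rates are taken from the figures above, while the event features and the boosted-decision-tree classifier are illustrative assumptions, not any experiment's actual trigger.

```python
# A minimal sketch of the trigger rate arithmetic and a toy multivariate
# selection. The rates come from the figures quoted above; the features
# and classifier are illustrative assumptions, not a real trigger.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# --- Back-of-envelope trigger arithmetic ---
collisions_per_s = 1e9       # ~1 billion proton-proton collisions per second
event_size_bytes = 1e6       # ~1 MB per detector readout
output_bytes_per_s = 1e9     # ~1 GB/s written to disk after the trigger

raw_bytes_per_s = collisions_per_s * event_size_bytes      # ~1 PB/s raw
events_kept_per_s = output_bytes_per_s / event_size_bytes  # ~1000 events/s
rejection = collisions_per_s / events_kept_per_s           # ~1 in 10^6 kept
print(f"raw ~{raw_bytes_per_s:.0e} B/s; keep ~{events_kept_per_s:.0f} events/s "
      f"(~1 in {rejection:.0e} events survives the trigger)")

# --- Toy multivariate selection of event topologies ---
# Two made-up per-event features (e.g. a momentum-like and an
# isolation-like variable); "signal" events sit at higher values of both.
rng = np.random.default_rng(0)
n = 20_000
background = rng.normal(loc=[20.0, 0.3], scale=[10.0, 0.15], size=(n, 2))
signal = rng.normal(loc=[45.0, 0.7], scale=[12.0, 0.15], size=(n, 2))
X = np.vstack([background, signal])
y = np.repeat([0, 1], n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)  # boosted decision trees
print(f"toy signal/background accuracy: {clf.score(X_test, y_test):.2f}")
```

In practice the real trigger runs under hard latency constraints and the multivariate inputs are carefully calibrated physics quantities; the toy above only conveys the flavour of the selection problem.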

The DUNE project, hosted at Fermilab, is one of the two flagship next-generation neutrino experiments. It will consist of two neutrino detectors placed in the world's most intense neutrino beam; these detectors will enable scientists to search for new subatomic phenomena and potentially transform our understanding of neutrinos and their role in the universe. Exploiting the full potential of the state-of-the-art liquid-argon detector requires highly sophisticated pattern recognition and event reconstruction techniques, including computer vision, machine learning and automated image processing.
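To give a flavour of the image-processing side of this work, the sketch below thresholds a toy 2D wire-versus-time hit map and groups contiguous hits into track-candidate clusters. It is a minimal Python illustration: the detector geometry, noise model and threshold are assumptions for the example, not DUNE's actual reconstruction chain.

```python
# A minimal sketch of hit finding and clustering on a toy 2D "wire vs.
# time" image standing in for a liquid-argon TPC readout. The geometry,
# noise model and threshold are illustrative assumptions.

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)

# Toy detector image: 128 wires x 128 time ticks of electronics noise.
image = rng.normal(0.0, 1.0, size=(128, 128))

# Inject a straight "track": a line of above-threshold charge depositions.
for t in range(20, 110):
    w = int(0.8 * t)  # track slope in wire/time units (arbitrary choice)
    image[w, t] += rng.normal(8.0, 1.0)

# Step 1: suppress noise with a simple charge threshold.
hits = image > 4.0

# Step 2: group contiguous hits into clusters; each cluster is a candidate
# track segment for downstream pattern recognition.
labels, n_clusters = ndimage.label(hits)
sizes = ndimage.sum(hits, labels, index=range(1, n_clusters + 1))
print(f"found {n_clusters} cluster(s); largest has {int(sizes.max())} hits")
```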

The Manchester group also leads an STFC-funded Centre for Doctoral Training (CDT) in Data Intensive Science, with the aim of developing future leaders in the data-intensive methodologies underpinning both modern science and Data Science industries. Research projects undertaken as part of the CDT will focus on data-intensive aspects of cutting-edge research across Astrophysics, Particle Physics and Nuclear Physics.

More information can be found at the Particle Physics webpage: http://www.hep.man.ac.uk/