Summary: Recent estimates of network capacity requirements for the HL-LHC era indicate that these cannot be met through technology evolution and price/performance improvements alone within a constant budget. An in-depth consideration of the HL-LHC Computing Model is thus needed, and an R&D program to formulate, design, and prototype the new Model is recommended. This program could take...
I summarize the computational needs for the determination of parton distribution functions at (N)NNLO accuracy. Our experience with the latest CT18 global analysis of NNLO PDFs indicates the need for a benchmarked infrastructure for the accurate and fast determination of PDFs from QCD data. Reduction of the current PDF uncertainties to meet the targets of the HL-LHC EW precision program and BSM searches...
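For context on how uncertainties enter such analyses (recalled here as general background, not quoted from the abstract): Hessian global fits such as CT18 quote the PDF uncertainty on an observable X via the standard symmetric master formula over the N pairs of eigenvector error sets S_i^±,

\Delta X = \frac{1}{2}\sqrt{\sum_{i=1}^{N}\left[X\!\left(S_i^{+}\right)-X\!\left(S_i^{-}\right)\right]^{2}}

Evaluating X on all 2N error sets is one reason fast, benchmarked fitting and convolution infrastructure matters for the precision targets mentioned above.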
https://www.snowmass21.org/docs/files/summaries/CompF/SNOWMASS21-CompF5-001.pdf
In the pipeline from detector to published physics results, the last step, "end-user analysis," is the most diverse. It can even be hard to discover what tools are being used, since the work is highly decentralized among students and postdocs, many of whom are working from their home institutes (or their homes).
However, GitHub offers a window into CMS physicists' analysis tool preferences...
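As a purely illustrative aside (not the methodology behind the study above), one way to open such a window is GitHub's public repository-search API: count how many repositories match the name of each analysis tool. The tool names below and the use of the requests library are assumptions made only for this sketch.

# Hypothetical sketch: estimate the relative visibility of analysis tools
# by counting public GitHub repositories matching each name.
# Uses the public GitHub search API (unauthenticated and rate-limited).
import requests

TOOLS = ["uproot", "coffea", "RDataFrame"]  # illustrative tool names, not from the talk

def repo_count(query: str) -> int:
    """Return the number of repositories GitHub reports for a search query."""
    r = requests.get(
        "https://api.github.com/search/repositories",
        params={"q": query},
        headers={"Accept": "application/vnd.github+json"},
        timeout=30,
    )
    r.raise_for_status()
    return r.json()["total_count"]

if __name__ == "__main__":
    for tool in TOOLS:
        print(f"{tool}: {repo_count(tool)} repositories")

A real survey would authenticate, refine the queries (e.g., with language or organization qualifiers), and cross-check against actual code contents rather than repository names alone.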
Meeting the computing, storage, and communications challenges of the HL-LHC era will require extensive use of emerging technologies that are currently in various stages of conception and pre-specification, and so are not yet on the computing roadmap or, more broadly, the experimental roadmap. I will briefly introduce the physics and technology barriers in terms of computational and storage nanoscale feature sizes,...
We give a very short outlook on the computational aspects of dynamical simulations for the study of the Frezzotti-Rossi model of elementary particle mass generation. Having recently demonstrated via lattice simulations that the non-perturbative mechanism exists, we now plan to investigate the compelling theoretical case that, within this framework, we will be able to relate all elementary...
In this lightning talk, I will introduce the GAMBIT (Global and Modular BSM Inference Tool) framework, a tool for doing global fits of particle physics models to a range of experimental results, including those from colliders, astrophysical and terrestrial dark matter searches, cosmology, neutrino experiments, and precision measurements. I will also briefly discuss the fits that have been...
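As background on what a "global fit" means in practice (a toy sketch only; GAMBIT itself is a C++ framework configured via YAML input files, and nothing below is GAMBIT code): one combines the likelihoods of several independent measurements constraining the same model parameters and then scans or samples that combined likelihood. All numbers and names in the sketch are invented for illustration.

# Toy illustration of a global fit: combine Gaussian log-likelihoods from
# several independent "experiments" constraining one model parameter,
# then brute-force scan that parameter (standing in for a real sampler).
import numpy as np

# (measured value, uncertainty) for three pretend datasets
MEASUREMENTS = {
    "collider": (1.2, 0.4),
    "direct_detection": (0.9, 0.3),
    "cosmology": (1.1, 0.5),
}

def log_likelihood(theta: float) -> float:
    """Sum of Gaussian log-likelihoods over all pretend datasets."""
    return sum(
        -0.5 * ((theta - mu) / sigma) ** 2
        for mu, sigma in MEASUREMENTS.values()
    )

grid = np.linspace(0.0, 2.0, 2001)
logl = np.array([log_likelihood(t) for t in grid])
best = grid[np.argmax(logl)]
print(f"best-fit theta = {best:.3f}, delta lnL vs. theta=0 = {logl.max() - log_likelihood(0.0):.2f}")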
We present a set of techniques studied by the ExaTrkX collaboration for classification and regression of large scale high energy physics data. Using graph structures and geometric machine learning, we observe excellent performance with particle tracking algorithms on silicon trackers and high-granularity calorimeters for HL-LHC, as well as LArTPCs for neutrino experiments. Promising future...
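For readers unfamiliar with this class of methods, the sketch below shows the generic graph-learning pattern behind such pipelines (it is not ExaTrkX code): detector hits become graph nodes, candidate hit-to-hit connections become edges, and a graph neural network scores each edge as track-like or not. The feature dimensionality, network sizes, and use of PyTorch Geometric are illustrative assumptions.

# Minimal sketch of edge classification on a hit graph with PyTorch Geometric.
# Hit features (x) and candidate edges (edge_index) are random placeholders.
import torch
from torch import nn
from torch_geometric.nn import GCNConv

class EdgeClassifier(nn.Module):
    def __init__(self, in_dim=3, hidden=64):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x, edge_index):
        h = torch.relu(self.conv1(x, edge_index))   # message passing over the hit graph
        h = torch.relu(self.conv2(h, edge_index))
        src, dst = edge_index                        # endpoints of each candidate edge
        pair = torch.cat([h[src], h[dst]], dim=-1)   # concatenate endpoint embeddings
        return torch.sigmoid(self.edge_mlp(pair)).squeeze(-1)  # P(edge belongs to a track)

# Random stand-in for one event: 100 hits with (r, phi, z) and 300 candidate edges.
x = torch.randn(100, 3)
edge_index = torch.randint(0, 100, (2, 300))
scores = EdgeClassifier()(x, edge_index)
print(scores.shape)  # torch.Size([300])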
The strong and growing role of machine learning (ML) in particle physics is well established and appropriate given the complex detectors and large data sets at the foundational layer of our science. Increasingly, physics departments are offering curricula to their undergraduate and graduate students that focus on the intersection of data science, machine learning, and physics. In this talk, we...