Conveners
Computing, Analysis Tools, and Data Handling: Joint session with Particle Detectors, Monday afternoon
- Matthew Judah (Colorado State University)
Computing, Analysis Tools, and Data Handling: Tuesday afternoon
- Aristeidis Tsaris (Fermilab)
Computing, Analysis Tools, and Data Handling: Wednesday morning
- Salman Habib (Argonne National Laboratory)
Computing, Analysis Tools, and Data Handling: Wednesday afternoon
- Pengfei Ding (Fermilab)
Computing, Analysis Tools, and Data Handling: Thursday morning
- Andrew Norman (Fermilab)
Computing, Analysis Tools, and Data Handling: Thursday afternoon
- Kaushik De (Univ. of Texas at Arlington)
192. The Calorimeter Global Feature Extractor (gFEX) for the Phase-I Upgrade of the ATLAS experiment
Giordon Stark
(University of Chicago)
31/07/2017, 13:30
Computing, Analysis Tools and Data Handling
Presentation
The ATLAS Level-1 calorimeter trigger is planning a series of upgrades in order to face the challenges posed by the upcoming increase of the LHC luminosity. The upgrade will benefit from new front-end electronics for parts of the calorimeter, which provide the trigger system with digital data at a tenfold increase in granularity. The Global Feature Extractor (gFEX) module is one of the...
Alexander Tuna
(Harvard University)
31/07/2017, 13:47
Particle Detectors
Presentation
The New Small Wheel (NSW) is a major upgrade to the muon spectrometer of the ATLAS experiment, which will be installed in 2019-2020. The NSW will comprise both sTGC and Micromegas detectors. One of the major goals of the upgrade is fast, precise muon reconstruction in hardware to allow triggering on single-muon events in high-pileup environments. This talk presents the status of the NSW...
Mr
Harish Potti
(University of Texas at Austin)
31/07/2017, 14:04
Computing, Analysis Tools and Data Handling
Presentation
In this talk, a new method of calibrating the instantaneous luminosity of the ATLAS experiment with the $Z\to \mu \mu $ process is presented. At the design center-of-mass energy of the LHC, the cross-section of the $Z\to \mu \mu $ process is known to very high precision, and the process has a high production rate (~1000 events/minute), which makes it suitable for luminosity measurement. Leading systematic...
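The mechanics behind such a rate-based calibration can be sketched in a few lines: the instantaneous luminosity follows from the measured candidate rate divided by the cross-section times acceptance and efficiency. The cross-section and acceptance values below are placeholder assumptions for illustration only, not the numbers of the analysis.

```python
# Illustrative sketch (not the speaker's code): estimate instantaneous
# luminosity from a counted Z -> mu mu rate via L = R / (sigma * A * eps).
# sigma_pb and acc_times_eff below are assumed placeholder values.

def lumi_from_z_rate(rate_hz, sigma_pb, acc_times_eff):
    """Instantaneous luminosity in cm^-2 s^-1 from a measured Z candidate rate."""
    pb_to_cm2 = 1e-36                      # 1 pb = 1e-36 cm^2
    sigma_cm2 = sigma_pb * pb_to_cm2
    return rate_hz / (sigma_cm2 * acc_times_eff)

# ~1000 events/minute (from the abstract) is about 16.7 Hz of selected candidates.
lumi = lumi_from_z_rate(rate_hz=1000.0 / 60.0,
                        sigma_pb=1900.0,       # assumed fiducial cross-section
                        acc_times_eff=0.5)     # assumed acceptance x efficiency
print(f"{lumi:.2e} cm^-2 s^-1")
```

With these placeholder inputs the result lands near 2e34 cm^-2 s^-1, the right order of magnitude for LHC running.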
Dr
Tomonari Miyashita
(California Institute of Technology)
31/07/2017, 14:21
Computing, Analysis Tools and Data Handling
Presentation
The Mu2e experiment at Fermilab aims to measure the charged-lepton flavor violating neutrinoless conversion of a negative muon into an electron, producing a monochromatic electron with an energy slightly below the rest mass of the muon (104.97 MeV). We expect to set a limit on the ratio between the muon conversion and capture rates of $6.7 \times 10^{-17}$ at 90% CL in three years of running using a...
Catrin Bernius
(SLAC)
31/07/2017, 14:38
Computing, Analysis Tools and Data Handling
Presentation
The ATLAS trigger has been used very successfully for the online event selection during the first part of the second LHC run (Run-2) in 2015/16 at a center-of-mass energy of 13 TeV. The trigger system is composed of a hardware Level-1 trigger and a software-based high-level trigger; it reduces the event rate from the bunch-crossing rate of 40 MHz to an average recording rate of about 1...
Heather Russell
(McGill University)
31/07/2017, 14:55
Computing, Analysis Tools and Data Handling
Presentation
The ATLAS experiment aims at recording about 1 kHz of physics collisions, starting with an LHC design bunch crossing rate of 40 MHz. To reduce the large background rate while maintaining a high selection efficiency for rare physics events (such as beyond-the-Standard-Model physics), a two-level trigger system is used. Events are selected based on physics signatures such as the...
Benjamin Nachman
(LBNL)
01/08/2017, 13:30
Computing, Analysis Tools and Data Handling
Presentation
Deep neural networks (DNNs) have revolutionized many areas of science and technology. In this talk, we will discuss cutting edge developments in DNNs for high energy physics, using jet physics (including calorimeter showers) as an example that has attracted significant recent attention. Domain specific challenges require new techniques to make full use of the algorithms. A key focus is on...
Taritree Wongjirad
(MIT)
01/08/2017, 13:55
Computing, Analysis Tools and Data Handling
Presentation
Deep learning algorithms, which have emerged over the last decade, are opening up new ways to analyze data for many particle physics experiments. The MicroBooNE experiment, a neutrino experiment at Fermilab, has been exploring the use of such algorithms, in particular convolutional neural networks (CNNs). CNNs are the state-of-the-art method for a large class of problems requiring...
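The core CNN operation applied to detector "images" can be illustrated compactly: a small filter slid over a 2D event image, followed by a nonlinearity. This is a minimal NumPy sketch of that building block only; real analyses use full frameworks such as TensorFlow or PyTorch, and the toy image and kernel here are invented for illustration.

```python
# Minimal sketch of the CNN building block: a "valid" 2D cross-correlation
# of a toy event image with one filter, followed by ReLU. Pure NumPy.
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' cross-correlation of a 2D image with a 2D kernel."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

# Toy "wire vs. time" image containing a diagonal track (illustrative).
img = np.eye(6)
# A filter that responds to diagonal activity (illustrative, not learned).
kern = np.array([[1.0, -1.0],
                 [-1.0, 1.0]])
feature_map = relu(conv2d_valid(img, kern))
print(feature_map.shape)   # (5, 5)
```

A trained network stacks many such filtered maps, learning the kernels from labeled events rather than fixing them by hand.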
Dr
Alexander Radovic
(College of William and Mary)
01/08/2017, 14:20
Computing, Analysis Tools and Data Handling
Presentation
The observation of neutrino oscillations provides evidence of physics beyond the standard model, and the precise measurement of those oscillations remains an important goal for the field of particle physics. The planned DUNE experiment is set to become a leading experiment in the study of neutrino oscillations. Taking advantage of a two-detector technique, a tightly focused beam at Fermilab,...
Dr
Fedor Ratnikov
(YSDA)
01/08/2017, 14:45
Computing, Analysis Tools and Data Handling
Presentation
The LHCb detector is a forward spectrometer optimized for the reconstruction of charm- and bottom-hadron decays in the LHC's proton-proton collisions. The need to process large amounts of data within the constraints of the data-acquisition and offline-computing resources pushes steadily toward the usage of advanced data-analysis techniques. Currently, LHCb takes data at rates significantly higher...
Jonathan Miller
(Universidad Técnica Federico Santa María)
02/08/2017, 10:45
Computing, Analysis Tools and Data Handling
Presentation
While machine learning algorithms have been used for decades, recent advances in deep convolutional neural networks have revolutionised the fields of computer vision, image recognition, and artificial intelligence. Modern particle physics experiments and detectors produce data that is analogous to modern high-resolution images, and we anticipate a similar revolution in particle physics....
Ms
Fernanda Psihas
(Indiana University)
02/08/2017, 11:10
Computing, Analysis Tools and Data Handling
Presentation
Deep Convolutional Neural Networks (CNNs) have been widely applied in computer vision to solve complex problems in image recognition and analysis. In recent years many efforts have emerged to extend the use of this technology to HEP applications, including the Convolutional Visual Network (CVN), our implementation for identification of neutrino events. In this presentation I will describe the...
Mr
Shaokai Yang
(University of Cincinnati)
02/08/2017, 11:35
Computing, Analysis Tools and Data Handling
Presentation
We are witnessing a revolution in experimental HEP driven by current innovations in deep machine learning technologies. For example, convolutional neural networks (CNNs) have been introduced to identify particle interactions in particle tracking detectors based on their topology, without the need for detailed reconstruction, and outperform currently used algorithms. We are trying...
Ms
Xiaoyue Li
(Stony Brook University)
02/08/2017, 11:55
Computing, Analysis Tools and Data Handling
Presentation
Data unfolding is a commonly used technique in the HEP community, particularly in cross-section measurements. Inspired by the deconvolution technique in digital signal processing, we propose a new unfolding method based on the Wiener filter and the SVD technique. Unlike traditional unfolding techniques, the Wiener-SVD unfolding method achieves data unfolding by maximizing signal-to-noise ratios in...
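The mechanics of SVD-based unfolding can be sketched briefly. The example below is a simplified cousin of the talk's method (SVD inversion with a Tikhonov-style damping of small singular values), shown only to illustrate how the SVD basis separates well- and poorly-measured modes; it is not the full Wiener-SVD construction of the abstract, and the toy response matrix is invented.

```python
# Hedged sketch: unfold "measured = R @ truth" by inverting the response
# matrix R in its SVD basis, damping noisy small-singular-value modes.
# This is Tikhonov-damped SVD inversion, NOT the full Wiener-SVD method.
import numpy as np

def svd_unfold(response, measured, tau=1e-2):
    """Damped SVD inversion of a response matrix."""
    U, s, Vt = np.linalg.svd(response, full_matrices=False)
    # Per-mode filter: ~1/s for large singular values, suppressed for small ones.
    filt = s / (s**2 + tau**2)
    return Vt.T @ (filt * (U.T @ measured))

# Toy 3-bin smearing matrix (rows: measured bin, columns: true bin).
R = np.array([[0.8, 0.2, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.2, 0.8]])
truth = np.array([100.0, 50.0, 25.0])
measured = R @ truth
unfolded = svd_unfold(R, measured, tau=1e-3)
print(np.round(unfolded, 1))
```

In noiseless toy data with a tiny damping parameter the unfolded spectrum reproduces the truth; the damping matters once statistical fluctuations excite the small-singular-value modes.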
Dr
Kate Whalen
(University of Oregon)
02/08/2017, 13:30
Computing, Analysis Tools and Data Handling
Presentation
In Run 2 at CERN's Large Hadron Collider, the ATLAS detector uses a two-level trigger system to reduce the event rate from the nominal collision rate of 40 MHz to the event storage rate of 1 kHz, while preserving interesting physics events. The first step of the trigger system, Level-1, reduces the event rate to 100 kHz with a latency of less than 2.5 μs. One component of this system is the...
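The rate figures quoted in the abstract imply large rejection factors at each trigger level; a quick back-of-the-envelope check using only the stated numbers:

```python
# Rejection factors implied by the trigger rates in the abstract:
# 40 MHz collisions -> 100 kHz after Level-1 -> 1 kHz to storage.
bunch_crossing_hz = 40e6
level1_output_hz = 100e3
storage_hz = 1e3

l1_rejection = bunch_crossing_hz / level1_output_hz   # rejection at Level-1
hlt_rejection = level1_output_hz / storage_hz         # rejection at the HLT
total_rejection = bunch_crossing_hz / storage_hz      # overall
print(l1_rejection, hlt_rejection, total_rejection)   # 400.0 100.0 40000.0
```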
Dr
Zhenbin Wu
(University of Illinois at Chicago)
02/08/2017, 13:47
Computing, Analysis Tools and Data Handling
Presentation
With the increasing luminosity and the large number of simultaneous inelastic collisions per crossing (pileup) at the LHC, online data selection becomes increasingly challenging. This talk will present the new approaches developed in the Level-1 trigger of the CMS experiment to cope with the increasing LHC luminosity and pileup.
Dr
Wes Gohn
(University of Kentucky)
02/08/2017, 14:05
Computing, Analysis Tools and Data Handling
Presentation
A new measurement of the anomalous magnetic moment of the muon, $a_{\mu} \equiv (g-2)/2$, will be performed at the Fermi National Accelerator Laboratory. The most recent measurement, performed at Brookhaven National Laboratory and completed in 2001, shows a 3.5 standard deviation discrepancy with the standard model value of $a_\mu$. The new measurement will accumulate 21 times those...
Prof.
K.K. Gan
(The Ohio State University)
02/08/2017, 14:22
Computing, Analysis Tools and Data Handling
Presentation
The LHC has recently been upgraded to operate at higher energy and luminosity. In addition, there are plans for further upgrades. These upgrades require the optical links of the experiments to transmit data at much higher speed in a more intense radiation environment. We have designed a new optical transceiver for transmitting data at 10 Gb/s. The device consists of a 4-channel ASIC driving a...
Dr
Yuri Oksuzian
(University of Virginia)
02/08/2017, 14:38
Computing, Analysis Tools and Data Handling
Presentation
The Mu2e experiment will search for a neutrinoless muon-to-electron conversion process using a novel apparatus design that promises almost four orders of magnitude of improvement in sensitivity over the current limit. An important background is caused by cosmic-ray muons faking the conversion electron signature. In order to reach the designed sensitivity, Mu2e needs to obtain a cosmic-ray veto...
Rebecca Linck
(Indiana University)
02/08/2017, 14:55
Computing, Analysis Tools and Data Handling
Presentation
The Jets-without-Jets algorithm presents a novel approach to the determination of jet observables without the need for time-consuming jet reconstruction. Because it relies on a set of simple sums, the algorithm is well suited to the kind of fast real-time calculation required by a trigger algorithm. Following the current data taking period, the global feature extractor (gFEX) will...
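The "simple sums" idea can be illustrated with a toy event-level observable built without any clustering: for each particle, sum the transverse momentum within a radius R around it, and let particles whose local pT sum passes a threshold contribute a weighted count. This follows the spirit of the Jets-without-Jets approach, but the specific observable, thresholds, and toy event below are illustrative assumptions, not the algorithm's exact definition.

```python
# Hedged sketch: a clustering-free jet-multiplicity proxy from simple sums.
# Each particle i with local pT sum >= pt_cut contributes pt_i / local_sum,
# so each hard cluster of particles counts roughly once in total.
import numpy as np

def njet_no_clustering(pt, eta, phi, R=0.4, pt_cut=30.0):
    n = 0.0
    for i in range(len(pt)):
        dphi = np.mod(phi - phi[i] + np.pi, 2 * np.pi) - np.pi
        near = (eta - eta[i]) ** 2 + dphi ** 2 < R ** 2
        local = pt[near].sum()           # pT within R of particle i (a simple sum)
        if local >= pt_cut:
            n += pt[i] / local           # weighted count: ~1 per hard cluster
    return n

# Two well-separated "jets" of a few particles each, plus one soft particle.
pt  = np.array([40.0, 15.0, 5.0, 35.0, 20.0, 1.0])
eta = np.array([0.0, 0.05, -0.05, 2.0, 2.05, -3.0])
phi = np.array([0.0, 0.1, -0.1, 1.5, 1.6, 0.5])
print(round(njet_no_clustering(pt, eta, phi), 2))   # 2.0
```

Because the observable is built from per-particle sums with no iterative clustering, it maps naturally onto fixed-latency hardware such as the gFEX.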
Prof.
Leo Piilonen
(Virginia Tech)
03/08/2017, 10:45
Computing, Analysis Tools and Data Handling
Presentation
I describe the charged-track extrapolation and muon-identification modules in the Belle II data-analysis code framework (basf2). These modules use GEANT4E to extrapolate reconstructed charged tracks outward from the Belle II Central Drift Chamber into the outer particle-identification detectors, the electromagnetic calorimeter, and the K-long and muon detector (KLM). These modules propagate...
Dr
Henry Schreiner
(LHCb)
03/08/2017, 11:07
Computing, Analysis Tools and Data Handling
Presentation
Datasets with millions of events in charm decays at LHCb have prompted the development of powerful fitting and analysis tools capable of handling unbinned datasets using GPUs and multithreaded architectures. GooFit, the original GPU fitting program with a familiar syntax resembling classic RooFit, has undergone significant redesign and has expanded physics and computing capabilities. The...
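The quantity GooFit parallelizes over millions of events is an unbinned negative log-likelihood. This CPU-side NumPy sketch shows that quantity for a single-Gaussian model, using the fact that the Gaussian MLE has a closed form; it is illustrative only and does not use GooFit's API, and the toy "mass peak" parameters are invented.

```python
# Hedged sketch: the unbinned negative log-likelihood that a GPU fitter
# evaluates in parallel, shown for a Gaussian model on toy data.
import numpy as np

def gauss_nll(data, mu, sigma):
    """Unbinned negative log-likelihood of a Gaussian."""
    z = (data - mu) / sigma
    return np.sum(0.5 * z**2 + np.log(sigma * np.sqrt(2.0 * np.pi)))

rng = np.random.default_rng(seed=7)
# Toy narrow "mass peak" (parameters are illustrative assumptions).
events = rng.normal(loc=1.865, scale=0.010, size=100_000)

# For a pure Gaussian the unbinned MLE is the sample mean and RMS;
# a real fit minimizes gauss_nll numerically over the model parameters.
mu_hat = events.mean()
sigma_hat = events.std()
print(round(mu_hat, 4), round(sigma_hat, 4))
```

A realistic fit adds background components and minimizes the summed NLL numerically; the per-event terms are what the GPU evaluates in parallel.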
Dr
Brian Pollack
(Northwestern University)
03/08/2017, 11:29
Computing, Analysis Tools and Data Handling
Presentation
The Bayesian Blocks algorithm, originally developed for applications in astronomy, can be used to improve the binning of histograms in high energy physics. Along with visual improvements, the histogram produced by this algorithm is a non-parametric density estimate, providing a description of background distributions that does not suffer from the arbitrariness of ad hoc analytical functions....
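The algorithm is a dynamic program over candidate change points. Below is a minimal implementation of the standard "events" fitness from Scargle et al. (2013), with their calibrated prior on the number of change points; production analyses would typically use a vetted implementation such as astropy's, and the toy dataset here is invented.

```python
# Minimal Bayesian Blocks (Scargle 2013 'events' fitness) for 1D event data.
# Returns adaptive bin edges; the prior follows Scargle et al.'s calibration.
import numpy as np

def bayesian_blocks(t, p0=0.05):
    t = np.sort(np.asarray(t, dtype=float))
    n = t.size
    # Candidate edges: data boundaries plus midpoints between adjacent points.
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    prior = 4.0 - np.log(73.53 * p0 * n ** -0.478)   # ncp_prior (Scargle 2013)

    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for k in range(n):
        widths = edges[k + 1] - edges[:k + 1]        # trial blocks ending at point k
        counts = (k + 1) - np.arange(k + 1)          # events in each trial block
        fit = counts * (np.log(counts) - np.log(widths)) - prior
        total = fit + np.concatenate([[0.0], best[:k]])
        last[k] = np.argmax(total)
        best[k] = total[last[k]]

    # Trace back the optimal change points.
    cps, k = [], n
    while k > 0:
        cps.append(last[k - 1])
        k = last[k - 1]
    return edges[np.array(cps[::-1] + [n])]

# Toy data: flat background plus a narrow excess around 0.5 (illustrative).
rng = np.random.default_rng(seed=11)
background = rng.uniform(0.0, 1.0, 200)
spike = rng.uniform(0.45, 0.55, 150)
edges_out = bayesian_blocks(np.concatenate([background, spike]), p0=0.05)
print(len(edges_out), "edges")
```

The returned edge list adapts to the data: wide bins where the density is flat, narrow ones around the excess, with no binning chosen by hand.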
Katherine Woodruff
(New Mexico State University)
03/08/2017, 11:51
Computing, Analysis Tools and Data Handling
Presentation
MicroBooNE is a liquid argon time projection chamber (LArTPC) neutrino experiment that is currently running in the Booster Neutrino Beam at Fermilab. LArTPC technology allows for high-resolution, three-dimensional representations of neutrino interactions. A wide variety of software tools for automated reconstruction and selection of particle tracks in LArTPCs are actively being developed....
Dr
Burt Holzman
(FNAL)
03/08/2017, 13:30
Computing, Analysis Tools and Data Handling
Presentation
The High Energy Physics (HEP) community is facing a daunting computing challenge in the upcoming years, as upgrades to the Large Hadron Collider and new technologies such as liquid argon detectors will require vast amounts of simulation and processing. Additionally, the stochastic nature of research suggests that leveraging elastically available resources would increase efficiency and...
Dr
Kenneth Herner
(Fermilab)
03/08/2017, 13:47
Computing, Analysis Tools and Data Handling
Presentation
The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division designed to lead the computing model for non-LHC experiments at Fermilab. The FIFE project fosters close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing scope and physics area. The project also...
Dr
Vikas Bansal
(Pacific Northwest National Laboratory)
03/08/2017, 14:04
Computing, Analysis Tools and Data Handling
Presentation
The Belle II experiment at the SuperKEKB collider in Tsukuba, Japan, will start physics data taking in 2018 and will accumulate 50 ab$^{-1}$ of $e^+e^-$ collision data, about 50 times larger than the data set of the Belle experiment. The computing requirements of Belle II are comparable to those of a Run I LHC experiment. Computing at this scale requires efficient use of the compute grids in North...
Adam Moren
(University of Minnesota, Duluth)
03/08/2017, 14:21
Computing, Analysis Tools and Data Handling
Presentation
The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study $\nu_{e}$ appearance in a $\nu_{\mu}$ beam. The detectors' fine-grained design and many resultant channels, coupled with the variety of physics triggers, the high-intensity NuMI neutrino beam, and the large cosmic ray muon rate at the far detector location, together result in computing requirements...
Matteo Cremonesi
(FNAL)
03/08/2017, 14:38
Computing, Analysis Tools and Data Handling
Presentation
The CMS experiment relies on the excellent performance of event reconstruction algorithms and timely processing of data and simulation samples for the prompt development of physics results. However, the large trigger rates, complicated event environment, and excellent performance of the LHC have posed many challenges for the CMS software and computing systems. We will describe recent developments...
Mark Neubauer
(University of Illinois at Urbana-Champaign)
03/08/2017, 14:56
Computing, Analysis Tools and Data Handling
Presentation
Realizing the physics goals of the planned or upgraded experiments in high-energy physics (HEP) over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. In order to identify and prioritize research and development in scientific software and computing infrastructure, a broad HEP community planning process is currently...