Notes for Topic 2: Beam Reconstruction and Analysis, Session 1
Notes by Rob Kutschke

Lead: Spentz
Scribe: Gianluca Petrillo
Notes: Rob Kutschke
Support: Ruth Pordes

Present in the room: David MacFarlane, Tingjun Yang (DUNE), Daniele Gibin (ICARUS), Pavel Kryczynski (LArIAT)
I don't think anyone was on the phone or ReadyTalk.

LArIAT Use cases
- B.1.1 is already solved by the appropriate interplay of artdaq and art.
- Need use cases that describe:
  - how LArIAT communicates information to the downstream experiments
    - Do they want numbers, histograms, ...?
  - how downstream experiments request that LArIAT make a particular measurement
- One possible requirement is that LArIAT make preliminary results available in an electronically readable, tabulated form that can be used by the downstream experiments.
- Data curation:
  - DUNE wants a requirement that they can reanalyze protoDUNE and WA105 data for the full life of DUNE.
  - Does DUNE want the same for LArIAT and others?
    - Tingjun says yes; they are still using ArgoNeuT data to test new code.

ICARUS Use Case(s)
- Region of interest to be decided based on light information.
  - Old ICARUS had coarse-grained light collection; new ICARUS has fine-grained light collection.
  - (Is this really a trigger/DAQ requirement?)
  - They have a requirement that they can do the necessary time correlations and communicate them to art.
- Spentz asked Daniele to describe the use case for interactivity with the reco algorithms.
  - They could define intervals of interest in the waveform for hit creation.
  - They were able to enable/disable individual hits.
  - To study vertexing they really did interact with the transformation of waveforms into hits and with the assignment of hits to tracks.
  - Requirement: the framework must support this.
    - Includes being able to persist the output of the events that have been fiddled with by hand.

Spentz asked about the DUNE use case: vertex finding to better than 2.3 cm.
- What requirements does this imply?
  - The conditions system must follow any operating conditions that threaten this.
  - Hit finding must be able to resolve overlapping blobs.
  - Algorithms must be able to deal with very busy vertices with many hits.
  - Possible solutions that have algorithm implications:
    - Dynamic exclusion zone near the vertex based on hit density.
    - Multi-pass vertex fitting, assigning hits to different tracks.
- Daniele asked Tingjun about photon finding.
  - Is there an algorithm to do a last-chance check for unassigned hits between the primary vertex and the converted-photon candidate vertex? Maybe we did not see these hits at first because we did not know about the primary vertex.
- Requirement that the MC system can simulate the physics, the electron drift, and the electronics response as separate steps.
- Conditions-related items (in the sense of conditions information that is determined from data):
  - Write all iterations to the production DB (or not).
    - Do not write to the production DB when doing development.
  - Must support schema evolution of conditions entities.
  - Local mirror of a subset of conditions.
- Data-on-data overlay.
- Data-on-MC overlay.
- Discussion about LArSoft vs LArLite:
  - Ruth wanted to write some sort of requirement that the system be flexible enough both to do analysis and to do algorithm development.
  - No one was sure how to say this.
  - Her preference was that LArSoft be flexible enough to be used in this role (my preference too).

Moving on to DUNE requirements

B.9.1
- Spentz: there is an energy scale requirement for DUNE. How do we calibrate the energy scale?
- Tingjun: the ultimate calibration is the pi0 mass resolution; next best is stopped cosmic muons; next best is through-going cosmic muons; Michel electrons from stopped muons are also useful. (See the calibration sketch at the end of these notes.)
  - Maybe the requirement is to be able to get a sufficiently pure sample of pi0s.

B.9.2
- Spentz noticed B.9.2: "2. Make a multivariate algorithm to tell you whether the showers are electrons or photons or pi-zeros. Find the vertex for some channels (use tracks — 1 or more)".
- Rob says that this implies only run-time configurability to define database keys.
- Then we thought some more. Suppose that we store the MVA coefficients in a database and that the algorithm that uses them is in LArSoft. Whose database holds the coefficients? Does each experiment store the coefficients in its own database? Or is there a LArSoft DB shared by the experiments? Either/or? (See the database-key sketch at the end of these notes.)
- This led to a general discussion of shared databases. Does it make sense for all SBN experiments to have a single shared DB?
  - Rob says be careful of this. Be sure to distinguish between elements of the schema that are accidentally the same today and elements that are intrinsically the same.
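
Calibration sketch (illustrative only; this is not LArSoft code, and the function and variable names are made up). It shows why the pi0 peak pins down the energy scale: the diphoton invariant mass m = sqrt(2 E1 E2 (1 - cos theta)) scales linearly with any common energy-scale factor, so the correction is simply the PDG pi0 mass divided by the position of the reconstructed peak.

    import math

    PI0_MASS_PDG = 134.9768  # MeV/c^2

    def diphoton_mass(e1, e2, opening_angle):
        # Invariant mass of two (massless) photon showers:
        # m^2 = 2 * E1 * E2 * (1 - cos(theta)).
        return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle)))

    def energy_scale_correction(photon_pairs):
        # photon_pairs: list of (E1 [MeV], E2 [MeV], opening angle [rad])
        # from a reasonably pure pi0 sample.  A real calibration would fit
        # the mass peak; the mean is used here only to keep the sketch short.
        masses = [diphoton_mass(e1, e2, th) for (e1, e2, th) in photon_pairs]
        reco_peak = sum(masses) / len(masses)
        # m scales linearly with a common energy-scale factor, so:
        return PI0_MASS_PDG / reco_peak

    # Toy check: build pairs with the true pi0 mass, then apply a fake 5%
    # under-calibration; the derived correction comes out near 1/0.95.
    pairs = []
    for e1, theta in [(300.0, 0.35), (400.0, 0.50), (250.0, 0.60)]:
        e2 = PI0_MASS_PDG**2 / (2.0 * e1 * (1.0 - math.cos(theta)))
        pairs.append((0.95 * e1, 0.95 * e2, theta))
    print(energy_scale_correction(pairs))  # ~1.053

The same structure applies if the sample is selected with stopped or through-going muons instead; only the reference quantity changes.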
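
Database-key sketch (illustrative only; the class and parameter names are invented, and the dict-backed "database" is a stand-in, not the real conditions interface). It shows the run-time-configurability point: the LArSoft-side algorithm knows only a key, and each experiment decides which database, its own or a shared one, that key resolves against.

    class ConditionsDB:
        # Stand-in for a conditions-database client; a real one would talk
        # to whichever database the experiment configures.
        def __init__(self, tables):
            self._tables = tables

        def fetch(self, key):
            return self._tables[key]

    class ShowerMVA:
        # The algorithm hard-codes no coefficients; only the key is part of
        # its run-time configuration (e.g. it could come from a job
        # configuration file).
        def __init__(self, db, coefficients_key):
            self._weights = db.fetch(coefficients_key)

        def score(self, features):
            # Toy linear "MVA": weighted sum of shower features.
            return sum(w * x for w, x in zip(self._weights, features))

    # Each experiment points the same algorithm at its own coefficients
    # (or at a table in a shared LArSoft database):
    dune_db = ConditionsDB({"shower_mva_v3": [0.8, -1.2, 0.4]})
    mva = ShowerMVA(dune_db, coefficients_key="shower_mva_v3")
    print(mva.score([1.5, 0.2, 3.0]))

Whether the table lives in an experiment-owned database or a shared LArSoft one is exactly the open question from the discussion; the sketch only illustrates that the algorithm itself does not have to care.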