Minutes of the June 26, 2013 LBNE simulations/reco meeting
Tom Junk, Matt Szydagis, Maxim Potekhin, Tyler Alion,
Kevin Wood, Norm Buchanan, Ryan Wasserman, Andrea Shacklock,
Mark Convery, Qizhong Li, Herb Greenlee, Rick Snider,
Jonathan Insler, Zepeng Li, Josh Klein, Stan Seibert, Sanjib Mishra.
Apologies to those omitted.
Jonathan has been working on a module that performs unpacking, deconvolution, and
hit-finding in a single step. The motivation is to avoid creating a recob::Wire object
in the event record, which takes a lot of memory and is largely empty. Jonathan Asaadi
has promised to make an alg out of GausHitFinder, but for now, Jonathan I. has put a
snippet of the hitfinder code in his own module.
Since the zero-suppressed data are stored in blocks of consecutive ticks, it is natural
to perform the deconvolution in the time domain. Jonathan will extract the time-domain
deconvolution kernel from the signal shaping service, truncate it so that a finite duration
set of raw data samples deconvolutes into a finite deconvoluted data set, and pass that
on to the GausHitFinder one block at a time. The truncation length of the kernel
is fcl-controllable, currently defaulting to 30 ticks. Tom suggests plotting the kernel
and comparing its duration to the inter-plane spacing divided by the drift velocity. A narrow
pulse of charge should have an impact on the wire signal bounded by that amount of time
(possibly both before and after the passage of the charge).
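A minimal sketch of the block-wise approach, and of Tom's sanity check on the truncation length, might look like the following. All names, numbers, and the simple convolution loop are illustrative assumptions; in the real code the kernel comes from the signal shaping service.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Apply a truncated time-domain kernel to one zero-suppressed block of
// samples. A finite input block and a finite kernel give a finite output,
// which can be handed to the hit finder one block at a time. This is an
// illustrative stand-in, not the LArSoft implementation.
std::vector<float> deconvolveBlock(const std::vector<float>& block,
                                   const std::vector<float>& kernel) {
  std::vector<float> out(block.size() + kernel.size() - 1, 0.f);
  for (std::size_t i = 0; i < block.size(); ++i)
    for (std::size_t k = 0; k < kernel.size(); ++k)
      out[i + k] += block[i] * kernel[k];
  return out;
}

// Tom's check: how many ticks does charge take to drift across one
// inter-plane gap? The arguments are detector parameters; none of the
// example values used below are official LBNE numbers.
int planeCrossingTicks(double spacingCm, double driftVelCmPerUs, double tickUs) {
  return static_cast<int>(std::ceil(spacingCm / driftVelCmPerUs / tickUs));
}
```

With, say, a 0.5 cm gap, a 0.16 cm/us drift velocity, and 0.5 us ticks (all hypothetical values), the plane-crossing time is about 7 ticks, comfortably inside the 30-tick default truncation.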
Jonathan had to write a new constructor for recob::Hit to allow it to be produced without
an associated recob::Wire.
Jonathan has also upgraded and tested the zero suppression so that it includes the measured
ADC values before and after a block of values above threshold, commonly called "nearest
neighbor readout" mode; the number of ticks to read before and after the block is fcl-controllable.
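The padding logic can be sketched as follows; the function name, signature, and merging of overlapping padded ranges are illustrative assumptions, not the actual module.

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Zero suppression with "nearest neighbor readout": runs of samples above
// threshold are kept, padded by `pad` ticks on each side (fcl-controllable
// in the real module). Returns (start tick, samples) for each kept block.
std::vector<std::pair<std::size_t, std::vector<short>>>
suppress(const std::vector<short>& adc, short threshold, std::size_t pad) {
  std::vector<std::pair<std::size_t, std::vector<short>>> blocks;
  std::size_t i = 0;
  while (i < adc.size()) {
    if (adc[i] <= threshold) { ++i; continue; }
    // Found the start of an above-threshold run; find its end.
    std::size_t j = i;
    while (j < adc.size() && adc[j] > threshold) ++j;
    // Pad by `pad` ticks on each side, clamped to the waveform.
    std::size_t lo = (i > pad) ? i - pad : 0;
    std::size_t hi = std::min(adc.size(), j + pad);
    // If the padded range touches the previous block, extend that block
    // instead of starting a new one.
    if (!blocks.empty() &&
        blocks.back().first + blocks.back().second.size() >= lo) {
      lo = blocks.back().first + blocks.back().second.size();
      for (std::size_t k = lo; k < hi; ++k)
        blocks.back().second.push_back(adc[k]);
    } else {
      blocks.push_back({lo, {adc.begin() + lo, adc.begin() + hi}});
    }
    i = j;
  }
  return blocks;
}
```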
Jonathan is also investigating low-charge hits that Tyler found in the MC.
Stan asks if this is nearest neighbor in time or in space. Currently it is in time, but we can
imagine also writing ADCs out on neighboring wires if an unsuppressed value was written on that
tick. Stan said that nearest-space-neighbor writing is more difficult in the DAQ, since electronics
modules would have to communicate with each other.
Tyler has been working on disambiguation algorithms using clustering. We have had mixed
luck with hit-based disambiguation -- in busy events with showers, there are many combinations
of U,V,Z hits and it is easy to get the wrong one. Tyler has started running 2D clustering
algorithms on the channel data first, and disambiguation is done on the clusters. Tyler tried
out several clustering algorithms -- DBScan, Kinga cluster, and Fuzzy clustering, finally
settling on fuzzy clustering as it is quite general. Unfortunately, the geometry of the
induction planes in LBNE makes the mapping of channel number to spatial position discontinuous.
Tyler has modified the fuzzy cluster metric to wrap around. He is consulting with Ben Carls
on how to make this general so that we can re-use the same code and not have to split it off.
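The wrap-around idea can be illustrated with a simple modular channel distance, as one might use inside the clustering metric; the helper name and the plain modular form are assumptions, not Tyler's actual modification.

```cpp
#include <algorithm>
#include <cstdlib>

// Distance between two channels on a plane whose channel-to-position
// mapping is periodic: take the shorter way around the wrap. `nChannels`
// is the period of the mapping for that plane (illustrative).
int wrappedChannelDist(int c1, int c2, int nChannels) {
  int d = std::abs(c1 - c2) % nChannels;
  return std::min(d, nChannels - d);
}
```

With a hypothetical 800-channel period, channels 1 and 798 come out 3 apart rather than 797, so hits on either side of the wrap can still fall into the same cluster.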
The hits in a cluster are likely to be together spatially, which helps the disambiguation
process immensely -- some parts of the cluster may be easier to disambiguate than others,
and the ambiguity choice can be carried over for neighboring hits.
The clustering may just end up getting called twice, once for disambiguation and once again
during the rest of reconstruction, or Tyler can save the pattern-recognition stage clusters
in the event record.
We also seek characterization of the performance of the algorithm. Tyler and Kevin have started
writing an analysis module that tallies up the fractions of correctly and incorrectly
disambiguated hits. We would like to generalize this to an analysis module that reads the output
of the tracking and calorimetry modules and accumulates energy histograms for estimating
energy resolution. Jim Stewart wants to know what impact wire angles of 45
degrees vs. 36 degrees have on energy resolution. Sanjib is interested in the energy resolution
of pizeros as a function of pizero energy. Tom suspects that disambiguation affects energy
resolution mainly when a cluster is divided into pieces, only some of which are accumulated
in the energy sum. He also cautions that this first attempt with full simulation and
reconstruction will not be our last, and may not be that exciting for physics sensitivity.
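The core bookkeeping of such an analysis module might look like the sketch below: compare each hit's chosen wire against MC truth and report the correct fraction. The struct and function names are stand-ins, not LArSoft types, and real truth-matching is more involved.

```cpp
#include <vector>

// One disambiguated hit: the wire the algorithm chose, and the wire
// MC truth says the charge was actually on (both hypothetical fields).
struct DisambigResult { int chosenWire; int trueWire; };

// Fraction of hits whose ambiguity was resolved correctly.
double correctFraction(const std::vector<DisambigResult>& hits) {
  if (hits.empty()) return 0.0;
  int correct = 0;
  for (const auto& h : hits)
    if (h.chosenWire == h.trueWire) ++correct;
  return static_cast<double>(correct) / hits.size();
}
```

The same accumulator pattern extends naturally to the planned energy histograms: bin reconstructed energy against true energy instead of tallying wire matches.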
Zepeng has been working on photon detector digitization and a raw event display. In LBNE,
we have nominally 1200 paddles, and >90,000 time samples per channel. This poses an interesting
problem for a raw data display, as it is more pixels than can fit on a screen. Zepeng showed
a display of a muon event that traverses the volume viewed by four APAs, with the time axis
extending a few microseconds.
We suggested that zero suppression be implemented for the PD digitization -- even though it
represents less data than the TPC, we still win big in data savings.
Zepeng asked about PD geometry in the 35T prototype. Tyler has put code in the geometry
perl script but disabled it until a final design is available. Jim Stewart said Dave Warner
of CSU has the plans for where the paddles are located.
Tom showed some slides from the meeting last week with ICARUS collaborators. We need
help with reconstruction, as well as cosmic-ray modeling. Daniele Gibin showed details
of ICARUS's mixture of automatic and visual scan reconstruction, focusing on the automatic
parts. For the physics measurements, cosmics are rejected automatically, event objects
are identified visually, and automatic methods are used to reconstruct the event objects.
A nice feature of ICARUS's reconstruction is that they can get 18% resolution on muon
energies just from multiple scattering. Paola Sala outlined areas in which ICARUS and LBNE
can collaborate on software.
At the meeting, Carlo Rubbia proposed that ICARUS share raw data with LBNE in order to test
reconstruction algorithms with real data and not just MC. We plan on starting this conversation.
It will require us to unpack, apply fixes, deconvolute, and find hits in ICARUS data. We also
need ICARUS geometry defined in LArSoft.