Minutes of the July 3, 2013 LBNE simulations and reconstruction meeting
Present: Tom Junk, Zepeng Li, Jonathan Insler, Kate Scholberg,
Matt Szydagis, Tyler Alion, Mike Kirby, Rick Snider, Stan Seibert,
Norm Buchanan, Ryan Wasserman, Andrea Shacklock, Brian Rebel, Eric Church
Apologies to those omitted.
Rick reported that Lynn Garren has made a plan to migrate the
repository and build system for LArSoft to use git and cmake,
respectively. We discussed the need for an LBNE repository
to store LBNE-specific code that is not managed by LArSoft,
but instead by LBNE. Much of what has been written so far
is tightly tied in to LArSoft, such as geometry tools, though
some of the algorithms currently under development are rather
specific to the LBNE detector configuration. We have the flexibility
to move code out of the LArSoft repository into detector-specific
repositories as needed.
In discussion after the meeting, we tentatively proposed to be
among the early adopters, basing the new LBNE repository
on git and cmake. A separate lbne software account will be
set up with its own group, so that file protections can be set
to protect the repository code from inadvertent overwrites, but
to allow reads.
Zepeng showed progress in adapting the event3D event display
written by Morgan Askins at UC Davis for the Water Cherenkov detector.
It uses the GLUI and GLUT packages and has a very nice interface with
sliders, checkboxes, and a virtual trackball that can rotate the event
in 3D. It runs on the gpvm nodes and with some installation,
runs on Mac OS X. It is not yet integrated into LArSoft. Zepeng suggests
reading in ROOT files written by a LArSoft job. This will work for developing
the event display, but the two-step process may prove clumsy. We should
look into ways to integrate it into LArSoft, though it does depend on the
external GLUT and GLUI packages.
It shows shaded cryostat volumes and colored photon detector components.
It does not (yet) show electron drift data, which are not inherently 3D anyway --
that requires reconstruction.
There is a movie feature -- you can watch events evolve in time, which is what
the photon detectors measure best. The time resolution is not quite good enough to watch
a muon propagate from one side of the detector to the other, but cosmic rays
arriving at separate times are clearly visible.
Brian suggests that we get in touch with Nathaniel Tagg, who wrote a nice 3D event
display for reconstructed TPC data that also includes a 2D display for
unreconstructed TPC data.
Ryan Wasserman gave a status report on simulating the CSU liquid argon photon detector test
system using LArSoft. He is using Fermilab computers because the group was unable to get LArSoft working
on CSU computers. The CSU dewar is a cylinder with no TPC, and workarounds have been developed.
Ryan thanks Tyler for all the help with the geometry. Materials such as stainless steel and TPB
are included in the geometry description. The group suggests not ray-tracing every photon in the
acrylic bars, but instead using either Ben Jones's or David Muller's response functions, which
are computed with standalone ray-tracing programs.
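The idea of replacing per-photon ray tracing with a precomputed response function can be sketched as follows. This is a minimal illustration: the exponential parameterization and both constants are assumptions for the sketch, not the actual Jones or Muller functions.

```python
import math
import random

# Toy bar response (illustrative assumption, NOT the Jones/Muller functions):
# detection probability falls off exponentially with distance to the readout end.
ATTENUATION_LENGTH_CM = 100.0  # assumed effective attenuation length
EFFICIENCY_AT_READOUT = 0.05   # assumed efficiency for photons entering at the end

def detection_probability(distance_cm):
    """Probability that a photon entering the bar distance_cm from the
    readout end is ultimately detected, per the toy response function."""
    return EFFICIENCY_AT_READOUT * math.exp(-distance_cm / ATTENUATION_LENGTH_CM)

def photon_detected(distance_cm, rng=random):
    """One Bernoulli draw against the tabulated response replaces
    tracing the photon's reflections through the acrylic."""
    return rng.random() < detection_probability(distance_cm)
```

The standalone ray-tracing programs would be used once to tabulate the response; the simulation then only evaluates the lookup, which is far cheaper than tracing each photon.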
Ryan is able to build a MicroBooNE-like photon library for the 40L dewar. Fewer photon hits
are being simulated than expected, and investigations include turning the dewar walls into sensitive
photon detectors just to see if all the photons can be collected. Ryan asks if one can have more
than one kind of photon detector, and Brian replied that one can: only the beginning
of an optical sensitive detector's name is used to identify it as such, so
suffixes on the names can be used to differentiate multiple kinds.
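Brian's point about the naming convention can be illustrated with a small sketch; the prefix string below is a hypothetical placeholder for whatever LArSoft actually matches on.

```python
# Hypothetical prefix for illustration; the convention described in the
# meeting is that only the beginning of the volume name identifies a
# volume as an optical sensitive detector.
OPDET_PREFIX = "volOpDetSensitive"

def is_optical_detector(volume_name):
    """A volume counts as an optical detector if its name starts with the prefix."""
    return volume_name.startswith(OPDET_PREFIX)

def detector_kind(volume_name):
    """The suffix after the prefix differentiates multiple detector kinds;
    returns None for non-optical volumes."""
    if not is_optical_detector(volume_name):
        return None
    return volume_name[len(OPDET_PREFIX):]
```

Under this scheme, several detector types (bars, paddles, dewar-wall test detectors) can coexist in one geometry by varying only the suffix.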
A discussion of repositories for this code followed. It was agreed that code for simulating
LBNE prototypes is LBNE code and should go into one or more LBNE repositories.
Tyler showed progress on cluster-based hit disambiguation. It's based on a modified
version of fuzzy clustering, with the distance metric in the U and V views adjusted to
account for the cyclic relationship between space and channel number. Tyler has met with
Ben Carls to discuss clustering and Hough line finding, and has developed
additional code for the APA geometry interface to do things like find wire intersection lists.
The algorithm works by clustering in the three views, and associating clusters across views
(currently done visually, but perhaps automatable using timing and charge density).
The endpoints can be disambiguated in the way that single,
isolated hits can be. There is currently a visual step to identify cluster endpoints.
We hope to have a fully automated disambiguation, even if it does not perform very well; we can
always optimize it over time.
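The cyclic adjustment to the distance metric mentioned above can be sketched as follows; this is a minimal illustration of the idea, not Tyler's actual implementation.

```python
def cyclic_channel_distance(c1, c2, n_channels):
    """Distance between two channel numbers on a wrapped (U or V) plane.
    Because wires wrap around the APA, channel numbers are cyclic, so
    take the shorter way around the cycle, not the plain difference."""
    d = abs(c1 - c2) % n_channels
    return min(d, n_channels - d)
```

With wrapping, channel 0 and channel n_channels - 1 are neighbors, which a plain |c1 - c2| metric would treat as maximally far apart.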
Hits within clusters can be disambiguated based on where the cluster is located. Validation
on clusters that span wrapping boundaries, and on tracks long enough to hit the same
channel twice on the same side, remains to be done.
Some discussion focused around disambiguating isolated hits on the periphery of EM showers.
These are easy to include in the cluster if the event is empty of any other activity, but
once one adds in noise, cosmics, radiologicals and other stuff, the overlay of stray hits
onto physics clusters can impact the energy resolution. The loss of real hits, however,
also impacts energy resolution, in the other direction. How tight to make the clustering
requirements is therefore a balancing act, and the answer will depend on whether
we are on the surface or underground.
Characterizing performance with 36-degree versus 45-degree wires is on the to-do list.
Tom mentioned that we have requested raw data from ICARUS and enough code and help to read it in
and apply ICARUS-specific software fixes to the raw data. We will need filter functions
and convolution kernels as well, but want to test our own deconvolution software. We will also
need to develop an ICARUS geometry for LArSoft. It is possible, though it takes some work, to overlay
two ICARUS events in a fake wrapped-wire ICARUS geometry, pretending that there
is an ICARUS-like detector but with wrapped APAs. We can add signals from the two events
together in the U and V views, keep the collection views unambiguous, and attempt to pull
the two events apart again with the disambiguation tools.
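The overlay idea could look something like the sketch below, assuming raw waveforms are available per view as lists of ADC samples; the event structure and view names here are hypothetical, chosen just for illustration.

```python
# Sketch of overlaying two ICARUS events to fake wrapped wires:
# induction-view waveforms are summed sample by sample, while each
# event keeps its own unambiguous collection view.
def sum_waveforms(w1, w2):
    """Add two equal-length waveforms sample by sample."""
    return [a + b for a, b in zip(w1, w2)]

def overlay_events(event1, event2):
    """event1/event2: dicts mapping view name -> list of ADC samples
    (hypothetical structure for illustration)."""
    return {
        "U": sum_waveforms(event1["U"], event2["U"]),
        "V": sum_waveforms(event1["V"], event2["V"]),
        "Z1": event1["Z"],  # collection view of event 1, kept separate
        "Z2": event2["Z"],  # collection view of event 2, kept separate
    }
```

Because each event's collection view is kept separate, the true hit assignments are known, giving a labeled sample against which the disambiguation can be scored.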
Jonathan Insler has been using the BackTracker to track down the low-charge hits Tyler identified
a few weeks back. We want to know if they are real or an artifact of the hit finding or
deconvolution or other data processing. Jonathan is also working on a module that finds hits
on zero-suppressed data using a time-domain deconvolution, without unpacking all the data into memory,
for speed and memory savings. We also have a need for a very fast hit finder for the 35T trigger,
which ideally does not even deconvolute the digits, or performs a very lightweight deconvolution
like a sum, possibly exponentially weighted.
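One possible shape for such a lightweight hit finder is sketched below, using an exponentially weighted running baseline in place of any deconvolution; the threshold and weight are illustrative, and this is not the 35T trigger code.

```python
def fast_hits(adc, threshold, alpha=0.1):
    """Flag ticks where the waveform rises more than `threshold` above an
    exponentially weighted running baseline; no deconvolution is performed.
    Returns the start tick of each hit."""
    baseline = float(adc[0])
    hits = []
    in_hit = False
    for tick, sample in enumerate(adc):
        if sample - baseline > threshold:
            if not in_hit:
                hits.append(tick)  # record the rising edge
                in_hit = True
        else:
            in_hit = False
            # update the baseline only off-pulse, so hits don't drag it up
            baseline = (1.0 - alpha) * baseline + alpha * sample
    return hits
```

A single pass with one multiply-add per sample keeps the cost low enough for trigger use, at the price of ignoring the field and electronics response entirely.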