A lot of attention has been paid to the applications of machine learning methods in physics experiments and theory. However, less attention has been paid to the methods themselves and their viability as physics modeling tools. One of the most fundamental aspects of modeling physical phenomena is the identification of the symmetries that govern them. Incorporating symmetries into a model can reduce the risk of over-parameterization and consequently improve the model's robustness and predictive power. As the use of neural networks continues to grow in particle physics, and as research in computer vision has demonstrated the usefulness of exploiting symmetries in data via network design, there is renewed interest in embedding the symmetries relevant to physics problems in the neural networks that analyze them, as a means of applying physically meaningful network constraints.
We present our work on Lorentz group-invariant and equivariant networks, in the context of problems including jet tagging and particle four-momentum prediction. Building on previous work, we demonstrate how careful choices in network design -- yielding a model drastically simpler than traditional approaches -- can achieve competitive performance. Such symmetry-respecting networks may not only serve as powerful analysis tools themselves, but by design may offer insight into which composite physical observables are relevant to particle identification and measurement tasks.
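To make the idea of a Lorentz-invariant network concrete, one common construction (a minimal sketch, not necessarily the architecture described above) feeds the network only Minkowski inner products of the particle four-momenta, since these scalars are unchanged by any Lorentz transformation. The function and variable names below are illustrative, not taken from the work itself:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -)
ETA = np.diag([1.0, -1.0, -1.0, -1.0])

def minkowski_dots(momenta):
    """Pairwise Minkowski inner products p_i . p_j for an (N, 4) array
    of four-momenta (E, px, py, pz). A network whose inputs are only
    these scalars is Lorentz-invariant by construction."""
    return momenta @ ETA @ momenta.T

def boost_z(momenta, beta):
    """Apply a Lorentz boost along the z-axis with velocity beta."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.array([[gamma,        0.0, 0.0, gamma * beta],
                  [0.0,          1.0, 0.0, 0.0],
                  [0.0,          0.0, 1.0, 0.0],
                  [gamma * beta, 0.0, 0.0, gamma]])
    return momenta @ L.T

# Two toy jet constituents (E, px, py, pz), in arbitrary energy units:
p = np.array([[50.0, 10.0, 5.0, 48.0],
              [30.0,  2.0, 8.0, 28.0]])

# The invariant features agree before and after a boost:
assert np.allclose(minkowski_dots(p), minkowski_dots(boost_z(p, 0.6)))
```

The diagonal entries of this feature matrix are the invariant masses squared, illustrating how such constrained inputs correspond directly to physically meaningful observables.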