There and Back Again: Interfacing between theoretical parameter spaces and their predictions

Description

Join us for two short talks centered on 1) fast predictions from theory with quantified uncertainties and 2) the inverse mapping from experimental data back to theoretical parameter space.

Recorded Meeting Video: https://www.youtube.com/watch?v=CuCnIhhU05k

    1. Bayesian Neural Networks for Fast Predictions from High Dimensional Theories

      Current searches for new physics are often guided by beyond the Standard Model (BSM) theories that depend on many unknown parameters, which makes testing their predictions computationally challenging. Bayesian neural networks (BNNs) can map the parameter space of these theories to a meaningful distribution of observables. This is demonstrated with the new package TensorBNN by modeling the predictions of the phenomenological Minimal Supersymmetric Standard Model (pMSSM), a BSM theory with 19 free parameters. The predicted quantities are the cross sections for arbitrary pMSSM parameter points, the mass of the associated lightest neutral Higgs boson, and the theoretical viability of the parameter points. All three quantities are modeled with average percent errors of 3.3% or less, in a time orders of magnitude shorter than that taken by the supersymmetry codes from which the results are derived. The posterior densities, delivered as point clouds, yield meaningful Bayesian confidence intervals for the predictions, further demonstrating the potential for machine learning to accurately model the mapping from the high dimensional parameter spaces of BSM theories to their predictions. A minimal, schematic BNN sketch follows the talk listing below.

      Speaker: Braden Kronheim (Davidson College)
    2. Machine learning techniques to map from experimental cross sections to QCD theory parameters

      We map experimental high-energy scattering data to quantum probability distributions that characterize nucleon structure and the emergence of hadrons in terms of the quark and gluon degrees of freedom of QCD. We train three network architectures, a mixture density network (MDN), an autoencoder (AE), and a combination of the two (AEMDN), to address the inverse problem of transforming observable space into theoretical parameter space. Gradually increasing the dimensionality of the parameter space and the hyperbox size of possible cross sections, we test the limits of this approach. The mixture density component allows multiple candidate parameter solutions to be produced, along with their probabilities. This approach has been used to accurately predict collinear parton distribution functions to within one standard deviation and with a χ² ≈ 1, comparable to current fitting methods. This model constitutes a portion of a new AI-based QCD analysis framework. A schematic MDN sketch likewise follows the talk listing below.

      Speaker: Eleni Tsitinidi (Davidson College)
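
Code sketch for talk 1: a minimal variational Bayesian neural network surrogate in the spirit described in the abstract. This is not TensorBNN's actual API; it uses tensorflow-probability's Flipout layers instead, and the 19-dimensional toy inputs and target are hypothetical stand-ins for pMSSM parameter points and their observables.

    import numpy as np
    import tensorflow as tf
    import tensorflow_probability as tfp

    N_PARAMS = 19   # the pMSSM has 19 free parameters
    N_TRAIN = 2000  # hypothetical training-set size

    # Scale each layer's KL term by 1/N_TRAIN so the prior is counted once
    # per pass over the data rather than once per minibatch.
    kl_fn = lambda q, p, _: tfp.distributions.kl_divergence(q, p) / N_TRAIN

    model = tf.keras.Sequential([
        tfp.layers.DenseFlipout(64, activation="relu", kernel_divergence_fn=kl_fn),
        tfp.layers.DenseFlipout(64, activation="relu", kernel_divergence_fn=kl_fn),
        tfp.layers.DenseFlipout(1, kernel_divergence_fn=kl_fn),
    ])
    model.compile(optimizer="adam", loss="mse")  # layer KL losses are added automatically

    # Toy data standing in for (parameter point, observable) pairs.
    rng = np.random.default_rng(0)
    theta = rng.uniform(-1.0, 1.0, size=(N_TRAIN, N_PARAMS)).astype("float32")
    y = np.sin(theta.sum(axis=1, keepdims=True)).astype("float32")
    model.fit(theta, y, epochs=30, batch_size=64, verbose=0)

    # Each call resamples the weights, so repeated passes build the "point
    # cloud" of predictions from which Bayesian intervals are read off.
    cloud = np.stack([model(theta[:5]).numpy() for _ in range(200)])
    lo, hi = np.percentile(cloud, [2.5, 97.5], axis=0)
    print("means:", cloud.mean(axis=0).ravel())
    print("95% interval widths:", (hi - lo).ravel())

Once trained, evaluating the network is a handful of matrix multiplications per weight sample, which is why a surrogate like this can be orders of magnitude faster than running the underlying supersymmetry codes point by point.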
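Code sketch for talk 2: a minimal mixture density network for the inverse map, taking a binned cross section in and returning a Gaussian mixture over theory parameters. This is not the speakers' code; the dimensions (D_OBS, D_PAR) and component count K are hypothetical choices for illustration.

    import tensorflow as tf
    import tensorflow_probability as tfp

    tfd = tfp.distributions

    D_OBS = 30   # number of cross-section bins (assumed)
    D_PAR = 4    # number of QCD theory parameters (assumed)
    K = 8        # mixture components, each a candidate solution

    def make_mdn():
        x = tf.keras.Input(shape=(D_OBS,))
        h = tf.keras.layers.Dense(128, activation="relu")(x)
        h = tf.keras.layers.Dense(128, activation="relu")(h)
        logits = tf.keras.layers.Dense(K)(h)            # mixture weights (pre-softmax)
        mus = tf.keras.layers.Dense(K * D_PAR)(h)       # component means
        log_sig = tf.keras.layers.Dense(K * D_PAR)(h)   # component log-scales
        out = tf.keras.layers.Concatenate()([logits, mus, log_sig])
        return tf.keras.Model(x, out)

    def build_mixture(params):
        # Unpack the flat network output into a Gaussian mixture distribution.
        logits = params[:, :K]
        mus = tf.reshape(params[:, K:K + K * D_PAR], (-1, K, D_PAR))
        sig = tf.exp(tf.reshape(params[:, K + K * D_PAR:], (-1, K, D_PAR)))
        return tfd.MixtureSameFamily(
            mixture_distribution=tfd.Categorical(logits=logits),
            components_distribution=tfd.Independent(
                tfd.Normal(loc=mus, scale=sig), reinterpreted_batch_ndims=1))

    def mdn_nll(y_true, y_pred):
        # Negative log-likelihood of the true parameters under the predicted mixture.
        return -build_mixture(y_pred).log_prob(y_true)

    model = make_mdn()
    model.compile(optimizer="adam", loss=mdn_nll)
    # After model.fit(cross_sections, parameters), build_mixture(model(sigma_new))
    # exposes each candidate parameter solution (a component mean) with its weight.

The mixture output is what makes degenerate inverse problems tractable here: where several parameter points produce near-identical cross sections, each appears as a separate component with its own probability instead of being averaged into a single meaningless prediction.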