EF05 Contributions — Michael Begel

[Standing by to begin]

ALESSANDRO: Hello, everyone. We don't see the room anymore. Is it still connected?

>> Hello, Alessandro.

ALESSANDRO: Hello, Michael. I see nobody is in the room. One person. Oh, no, okay. We can see them now. It looks like we are missing the audience in the room.

>> So, I guess you want me to wait?

ALESSANDRO: I guess so. We will wait a few more minutes.

MICHAEL: Let me at least try to share the screen and make sure everything is functional.

>> Are we shooting for 25 plus 5?

ALESSANDRO: For the presentation? Let me check the agenda. It's 20 plus 10.

>> Got it. Thanks.

MICHAEL: Everybody should be able to see my slides now.

ALESSANDRO: Yes, we do. Full screen. Good. So, each presentation is 20 minutes plus 10 for questions. We start at 1:00 Eastern time, finish this session at 2:00, and continue from 2:00 to 3:00 with the same format.

>> Should we start? Is somebody sharing their slides?

MICHAEL: Yeah, that's me. And I think I have my slides up.

>> I can see your slides, so why don't we get started. Welcome to the afternoon session of the EF topical survey. Our first talk is from Michael Begel on precision QCD at the energy frontier. Please go ahead.

MICHAEL: I'm presenting on behalf of EF05, precision QCD. After a brief introduction, I'll talk about a few of the topics that we're covering: the strong coupling constant, event shapes, jet substructure and fragmentation functions, some results expected from the HL-LHC, and expectations for forward physics. As we all know, QCD is a firmly established theory. It has a very rich phenomenology with quantitative predictions.
Quantitative precision is challenging, limited by PDFs and perturbative expansions, and the strong coupling is the least well known coupling in the Standard Model. And to say this up front: QCD is not the driving force for many future experiments, but it is crucial for understanding them. It has an important role to play as we go forward. So, we presented a series of focus questions at the beginning of the Snowmass process. I'm not going to go through them here, but they're listed on the slide. Some of these I will talk about today; the others will move into the proceedings. Again, I want to thank everybody for submitting many interesting white papers to EF05. This overview only contains a small selection. I want to thank the authors particularly for providing slides, and note that all mistakes and misinterpretations are ours, not those of the authors.

So, let me begin with alpha_s. Of course, the strong coupling constant is extremely important, and it's quite interesting that in QCD there are many different ways you can measure it. There was a comprehensive white paper presented for Snowmass, linked at the bottom of the slide, which covers many different ways one can extract alpha_s from data and from theory, and compares them. This summary table gives a very nice overview, both in terms of the many different approaches to extracting alpha_s and the current state of the theoretical and experimental uncertainties, and then, in the rightmost column, the near- and long-term future progress. What you see is that we're actually doing fairly well. The world average has a bit less than 1% precision on alpha_s at M_Z. Theory calculations are typically at NNLO or higher. Lattice QCD in particular is a very strong contributor, with a dominant systematic uncertainty that's actually slightly lower than the world average.
And you can see that all the different methods you can use to extract alpha_s play into this, because each has its own theoretical and experimental systematic uncertainties that can be beaten against each other. In the near term, which primarily means data from the HL-LHC, the uncertainty on the world average is expected to improve by roughly a factor of two. And moving to future collider facilities, we expect it to go down even more significantly.

Just to look at this in a little more detail, consider alpha_s at LEP versus the projections for the FCC-ee. You can see all the LEP values clustered around 0.12, and you can see the combined extraction at LEP. The expectation for the FCC-ee is represented by this very narrow band, placed at the same value as the LEP extraction just to show what a comparison would look like between the parabola at LEP and what we would get from a Giga-Z collider facility. It's amazing: we get roughly an order-of-magnitude overall improvement in precision. It should be noted that the current world average is slightly low compared to this LEP value. That is, of course, not significant at present, but it gives us an opportunity in the future, as we probe alpha_s at very high precision, to keep looking for sources of new physical phenomena that could lead to deviations in the coupling.

Now, it's important not just to know alpha_s at one point, at the mass of the Z, but also to look at its running. One option here is a large DIS machine like the LHeC. In particular, the right-hand plot shows the running of alpha_s compared to direct measurements. Most of these measurements are extractions from jet cross sections, overlaid here from many different experiments.
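MICHAEL (aside): To make the running concrete, the scale dependence discussed here is governed by the QCD renormalization group equation. A minimal one-loop sketch, with a fixed number of active flavors and no quark-mass thresholds (a deliberately simplified toy, not the multi-loop treatment used in real extractions):

```python
import math

def alpha_s_1loop(Q, alpha_mz=0.118, mz=91.1876, nf=5):
    """One-loop running of the strong coupling from the scale M_Z to Q.

    Toy illustration only: fixed nf, no flavor thresholds, no
    higher-loop terms in the beta function.
    """
    b0 = (33.0 - 2.0 * nf) / (12.0 * math.pi)  # one-loop beta coefficient
    return alpha_mz / (1.0 + alpha_mz * b0 * math.log(Q**2 / mz**2))

# Asymptotic freedom: the coupling shrinks as the scale grows.
for Q in (10.0, 91.1876, 1000.0):
    print(Q, round(alpha_s_1loop(Q), 4))
```

This captures why a single wide-lever-arm DIS machine is so valuable: one experiment maps the whole curve, rather than stitching together extractions at isolated scales.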
One of the advantages of having a large DIS experiment that can span many orders of magnitude in this process is that you can get, from a single accelerator and a single set of experiments, a direct extraction of alpha_s across a very large range of energies. In particular, the green band here represents effectively the world-average uncertainty, and the uncertainty from this facility would be marked by these black points. So again, this is a really amazing opportunity to have a very precise measurement of the strong coupling and of its running.

Now, there are other ways, of course, of extracting alpha_s: from event shapes and from energy-energy correlators. Both of these have been heavily used in the past and continue to be used as we move forward. In particular, it should be noted that they have different sensitivities to nonperturbative and perturbative physics effects, and there have been many advances recently. One aspect I should point out: the top plot here is basically a cartoon showing what the different regions of a thrust distribution look like. Different regions of these distributions have different sensitivities to nonperturbative and perturbative effects and to different orders in the calculations, and we can use this to disentangle the pieces. In addition, new advances in jet physics have provided ways of mitigating the nonperturbative corrections, for example through jet grooming, and through expansions of the perturbative calculations. This is being actively studied, particularly for the HL-LHC and for the EIC, where energy-energy correlators in particular give you very nice differentiation between the different pieces.
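MICHAEL (aside): The energy-energy correlator mentioned here weights every pair of final-state particles by the product of their energy fractions and histograms the pair's opening angle. A toy sketch of that idea, assuming a hypothetical `(E, px, py, pz)` tuple format for particles (real analyses use their own binning and normalization conventions):

```python
import numpy as np

def eec(particles, nbins=20):
    """Toy energy-energy correlator: for every (ordered) pair of
    particles, add E_i * E_j / Q**2 to a histogram in the pair's
    opening angle chi.  `particles` is a list of (E, px, py, pz)
    tuples; Q is the total energy.
    """
    E = np.array([p[0] for p in particles])
    p3 = np.array([p[1:] for p in particles])
    Q = E.sum()
    edges = np.linspace(0.0, np.pi, nbins + 1)
    hist = np.zeros(nbins)
    for i in range(len(particles)):
        for j in range(len(particles)):
            cos_chi = np.dot(p3[i], p3[j]) / (
                np.linalg.norm(p3[i]) * np.linalg.norm(p3[j]))
            chi = np.arccos(np.clip(cos_chi, -1.0, 1.0))
            k = min(int(np.searchsorted(edges, chi, side="right")) - 1,
                    nbins - 1)
            hist[k] += E[i] * E[j] / Q**2
    return edges, hist

# Two massless back-to-back particles: half the weight sits at chi = 0
# (self-pairs) and half at chi = pi.
edges, h = eec([(1.0, 0.0, 0.0, 1.0), (1.0, 0.0, 0.0, -1.0)], nbins=10)
```

By construction the histogram sums to one, since the pairwise weights satisfy sum_ij E_i E_j = Q^2; the perturbative and nonperturbative regimes show up at different angular scales, which is what makes the observable useful for disentangling the two.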
These can be used to track the transverse momentum distributions of the partons. And these observables, or at least closely related observables, are being used for extracting information directly from jet events at the HL-LHC.

If I move on and look at jets and jet substructure: of course, jet substructure has been looked at from the very early days of the kT algorithm, but it has emerged as a tool in the LHC era and will play a role in future collider experiments. It gives us a strong understanding of jets for Standard Model measurements as well as beyond-Standard-Model searches. And all of this has a very broad connection with the rest of Snowmass, not just EF05, EF06, and EF07, but also searches, top physics, the theory frontier, and the computational frontier. There are many interesting jet substructure signatures. You have jets originating from boosted Standard Model particles, from Higgs bosons, electroweak bosons, and top quarks; that's an entire industry of its own. It's also useful for mitigating backgrounds from pileup and other experimental effects. This is something to keep track of and understand moving forward, because there are a lot of different facilities proposed, and these techniques and technologies are quite useful at those facilities, independent of the measurements we're trying to make. You can also use them for novel signatures, such as dark jets, displaced jets, photon jets, and delayed jets. One key point here, though, is that theoretical advances and improvements in the Monte Carlo modeling of QCD radiation have been absolutely essential in the development of this jet substructure technology. And we really expect further improvements: new tunings, calculations pushed to higher orders, and, as we have seen in many places, applications of machine learning and artificial intelligence. We're continuing to make real advances in this area.
This is a growth area, really, and it's benefiting greatly from theory–experiment exchanges of ideas and direct collaboration. Now, as I mentioned before, future collider environments will be a challenge for jet substructure. At electron–positron colliders, you really do require a very high-precision understanding of QCD. It is a very clean environment, at least for some of the e+e− colliders that have been proposed; others are less clean. This is somewhere you can disentangle the very small-angle emissions. There's a bit less emphasis here on high-momentum boosted objects like top jets, but it is still a very important area that will be of great use as we try to extract information about color flow and the differences between initial- and final-state radiation. Now, at a muon collider, the collision energies are on par with the LHC, but it should be noted that the experimental environment has significant beam-induced backgrounds, and these are going to require novel noise-removal techniques similar to what's been developed for hadron colliders. This is something to pay attention to; those technologies will be quite useful in this context. And we should acknowledge that high-energy hadron colliders will require more granular detectors to cope with larger boosts. And, of course, here we really do expect higher pileup, which means advances in detector technologies, like the inclusion of precision timing, and further development on the algorithm front. So, jet substructure has a key role to play at any future collider. This is really a breeding ground for innovation. There are a lot of examples in the white paper linked below, and again, the support for this interdisciplinary work has been absolutely crucial for the progress.
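MICHAEL (aside): One workhorse of the grooming and noise-mitigation toolkit discussed here is Soft Drop, which tests whether the softer of two subjets carries enough momentum to be kept: min(pT1, pT2)/(pT1 + pT2) > z_cut (ΔR12/R0)^β. A minimal sketch of just that test (the parameter values z_cut = 0.1, β = 0, R0 = 0.8 are common choices, used here purely for illustration, not anything prescribed in the talk):

```python
def passes_soft_drop(pt1, pt2, delta_r, z_cut=0.1, beta=0.0, r0=0.8):
    """Soft Drop condition on a pair of subjets: keep the splitting if
    min(pt1, pt2) / (pt1 + pt2) > z_cut * (delta_r / r0) ** beta.
    """
    z = min(pt1, pt2) / (pt1 + pt2)
    return z > z_cut * (delta_r / r0) ** beta

# A fairly symmetric splitting survives; a very soft one is groomed away.
print(passes_soft_drop(100.0, 50.0, 0.4))  # z = 1/3  > 0.1 -> True
print(passes_soft_drop(100.0, 5.0, 0.4))   # z ~ 0.048 < 0.1 -> False
```

In a full groomer this test is applied while walking backward through a Clustering/Angular-ordered declustering sequence, discarding soft wide-angle branches; that removal of soft contamination is exactly why such techniques transfer to the pileup and beam-background environments described above.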
It's really a connection between different communities: theorists, experimentalists, detector physicists, and computer scientists getting together and understanding how these elements can be beneficial as we design the next generation of experiments.

Now, on the other side of this, we have fragmentation functions and hadronization, which provide access to the nonperturbative aspects of QCD. This is not yet amenable to lattice calculations the way PDFs and alpha_s are. All of this is very important not only for what is going on at the LHC right now and for the EIC, but also for the future, as we look at fragmentation functions in the different contexts of future colliders. Now, Belle II is taking data that could provide unique and crucial input to the fragmentation functions. In particular, it should be noted that most of the Monte Carlo tuning for fragmentation is old and relies on LEP data. Belle II sits at a lower center-of-mass energy, a lower anchor point in Q, which lets you check the extrapolations and add to the fragmentation functions you're deriving. So an integrated approach between experimentalists and theorists is really important here. New theoretical developments provide a basis for leading-hadron correlations, and the impact of a number of smaller effects, like the dead-cone effect for massive partons, is important and interesting to fold into this work.

So, this brings me to the HL-LHC. Of course, ATLAS and CMS will continue to provide amazing precision measurements that will inform our understanding of QCD for decades to come. Again, I'll just note: QCD is not a driving force for many future experiments, but it is absolutely crucial that we make these precision measurements and understand the underlying physics going forward. Plotted here are three pieces from the pub note that was put out by the two collaborations.
On the left is the inclusive jet cross section. This is a plot of the PDF uncertainty, which is dominated by the gluon, versus pT. The gray band represents CT14; the red band represents a possibility of how good we could get on the gluon PDF using the jet data. Personally, I am quite amazed to ever see a plot with close to 1% uncertainties on the gluon. It's astounding. In the middle are the direct photons. Here we're looking at comparisons to different PDFs; this is also sensitive to the gluon PDF. You'll note that the uncertainties are quite low, and they're actually dominated by the electromagnetic calibration uncertainties in the calorimeters. This is a place we should be pushing harder, and something to think about as we move into the new generation of colliders: here we're talking about half-a-percent uncertainties or less on EM objects. What do we need at an FCC-ee? And the right-hand plot is really cool: it's the Δφ correlation in tt̄ events. I've made Δφ measurements at quite a few experiments; making this measurement in tt̄, with extractions of alpha_s, is truly an amazing feat. These are all things we can do with the HL-LHC data that's yet to come.

>> You have about 3.5 minutes left.

MICHAEL: Thank you. In addition, we have the possibility of doing forward physics at the HL-LHC. There's a proposed facility to be placed about 620 meters from the ATLAS interaction point, shown on the upper right. You have the central physics appearing at ATLAS; this is basically a neutrino DIS experiment, with correlations to the central experiment. It's effectively a neutrino beam at the LHC, with energies up to a few TeV. This opens phase space in forward hadron production, at both large and small x, important for structure functions and for PDFs. Plotted below is a fairly standard x and Q² plot.
You see, for example, that this red range is covered by experiments at the LHC, the blue range here is the EIC, and the other range is where the forward physics facility would work. This is for D mesons, charm quarks. You can see you get really nice measurements and coverage at very low x and very low Q², but also coverage at high x and high Q², regions that have not been measured in 20 or 30 years. It will be interesting to use modern techniques and re-examine those regions of phase space. And all of this is quite complementary to the EIC: this is neutrino DIS, the EIC is charged-lepton DIS.

Of course, I can't go without saying something about theory and modeling advances. Obviously, advances in QCD rely heavily on improvements in theoretical calculations, N3LO and even N4LO calculations. There's been a lot of activity over recent years, represented in the Theory Frontier. Several of the topics require a deep interplay between theory and experiment, for example alpha_s, jet substructure, and PDFs, as you'll hear in the next talk. And you don't get experimental advances without further developments in the Monte Carlo generators, and in their efficiency; this is big-data event production, which is needed for colliders. We need billions, if not trillions, of events. This is a computing challenge as well as a theoretical challenge that really needs to be understood and adapted to for the next colliders. Monte Carlo generators have come a long way from the early days of just leading-order tree-level diagrams; the inclusion of higher-order matrix elements and event weights is crucial for really getting the event shapes correct and understanding what we're looking at in the data.

So, in summary, QCD is an active and vibrant area of physics with significant recent progress in both theoretical approaches and experimental measurements. We are comparing higher-order calculations against precise data.
There are a lot of innovative techniques taking advantage of the latest tools and technology, for example in lattice simulations, and taking advantage of machine learning where we can. There's a lot that remains to be learned from the HL-LHC and the EIC, which are the next two colliders that should happen. A lot of important and unique results will continue to come out, and these results will inform our understanding for decades. And I've plotted here on the bottom a really cool plot showing the running of the b-quark mass, with the measurement from the LHC, which is the point with the large uncertainty, the red point, and the value measured at LEP on the Z pole. One of the really cool things about this is that you're using the Higgs boson decaying to bb̄ to make a QCD measurement. We have now measured the b-quark mass at two energy points, and we will do even better at the HL-LHC as we move forward; you can see the uncertainty will drop dramatically. And once we move to a Higgs factory, this is going to become an ultra-precise measurement. We will be able to look at it in great detail and understand whether or not deviations show up in this very crucial parameter. So, while QCD is not going to drive the choice of the next accelerator, it will impact its utility, and it will impact the requirements for the detectors that have to be built. So, thank you. And I want to thank everybody for the great inputs, both the slides and the white papers.

>> Thank you, Michael. I guess we'll start with questions on Zoom. And I see one hand up already from Michael. Please, go ahead.

>> Yeah, thank you. So, you mentioned the forward physics facility at about 600 meters. We had an expression of interest, which didn't become a white paper, for a hadron spectrometer in the region between roughly 80 and 180 meters. We were developing it along with the long-lived particle search there.
But it has potential in that region for measuring very forward lepton pairs and charm and so on. Particles come down at very small angles and get bent by the dipoles of the LHC, to the left and right, and in principle one can identify them with detectors. We didn't include it because there's no white paper on it, but the potential is there; it's a good fit for that region. One can measure e-pairs and mu-pairs at low masses. And if one can do particle identification in this region, one could also measure direct charm production, which would be very interesting. But it's not going to happen any time soon unless somebody takes it up. There is information about it, but it didn't become a white paper. I wanted to mention that.

MICHAEL: Thank you, Mike. It would be interesting to see that. Would it go in where one of the AFP stations is currently?

>> No, the AFP stations are down at 240 meters. This is...

MICHAEL: Closer, okay.

>> Right. Between the D1 dipole, at about 80 meters in the high-luminosity era, and the TAN absorber at about 180 meters, it's just a straight pipe. There are plans to make larger pipes there for the long-lived particle search. But if one had a vacuum chamber with windows on the right and left sides, one could catch the particles that have been bent out of the beam by the dipoles, and you could make a 10-meter-long spectrometer there. Now, there have been developments: there's a group developing transition radiation detectors, and they're doing this for hadron identification. For Run 5, it's definitely a possibility. And I should say we shouldn't turn the LHC off without having measured hadrons in this region, which has not been done since the ISR days way back. At the ISR they measured forward Λc production, but it's not been done since, and it really should be done.

MICHAEL: Thank you, Mike.

>> Michael, you showed the large improvement expected in the accuracy of alpha_s from a Tera-Z facility, and you compared that with the current measurements of alpha_s.
I was wondering if there have been any estimates from other future facilities, besides Tera-Z, of how much we could expect to improve alpha_s?

MICHAEL: The one on the LHeC, which is a DIS machine, also shows... I mean, I guess I don't have the extracted number here, but it's in the white paper, which shows the improvement you can get.

>> Do you know how the accuracy compares, how much of a reduction of the uncertainty it is for that facility?

MICHAEL: Not off the top of my head. I know it's presented in there. I think... here. It shows about 0.2%.

>> That's pretty good.

>> As a follow-up, are there estimates of alpha_s for the EIC?

MICHAEL: I didn't see any. Good question.

>> I can comment, this is David speaking. This was discussed in the workshop, if you want me to give some quantitative values here. The specificity of the EIC is the use of polarized beams; therefore, we have access to the Bjorken sum rule, and you can use that for an independent extraction. A study has been presented, and the good news is that, from the theoretical side, this is a clean observable. The not-so-good news is that the current data propagate into an uncertainty of about 5% today, so it's not really competitive. However, the EIC people provide an estimate that this can be reduced by a factor of two, down to 2.5% or so. That's using just the sum rule; I don't think you can go much better than this. The other observables, inclusive DIS, or charm, or jets, are limited, because the available luminosity and the center-of-mass energy are not so good. So, really, the EIC will not go below 1%, I think. It's 2.5%, or maybe, adding other observables, in the 1 to 2% ballpark. But to go to the really precise levels, you need a machine like the FCC-ee.

>> Another question from the room.

>> Hi, Michael. You made a very strong case for jet substructure, but can you give a bit more specifics? You said there could be many improvements.
Do you have some examples? And how much of an improvement might we be able to expect?

MICHAEL: Improvement in which? There are a lot of different places where improvements are happening.

>> Yeah, for example, in the signatures, in the Standard Model particles or dark jets.

MICHAEL: There's a discussion in the white paper. It's not particularly detailed in terms of numbers, but there are links to specific studies, and we're going to compile that as best we can for the Snowmass report. So, I don't have any numbers to give you right now.

>> But is this based on new algorithms? Or better theoretical understanding? Where does it come from?

MICHAEL: It's really, I think, a mix. And perhaps somebody from the community wants to comment.

>> I guess I can comment, though I wasn't on this white paper. There's a broad range of algorithms, and there are always new algorithms coming along. But actually, I'm going to ask a question and make a comment at the same time. There have been a lot of recent developments toward looking at the particle composition of jets, right? Looking at the particle content. And with fast timing detectors and other innovative particle ID coming on, there's potentially a lot of room to explore there. What has become particularly compelling recently is actually jet substructure in heavy-ion collisions, where you can start to make strong statements. You will hear about this more in Yen-Jie's talk. But just to follow up: Michael, was there any discussion on particle composition in jets?

MICHAEL: Particle composition in terms of fragmentation functions, yes, but not as particle ID, no.

>> All right. We have a question here from Michael as well.

>> First of all, Michael, thank you very much, it's a really beautiful talk on QCD. I have two comments, one of which I hope is noncontroversial.
And the other may be a little more so. Let me start with the first one. You pointed out that it's important to think about the theoretical effort needed to generate these QCD predictions, and I'd just like to second that. I remember in the early 2000s there was something called the Les Houches wish list, which was to get NLO corrections to certain three-body processes at the LHC, and it was thought to be very difficult. In the decade of the 2000s, there were revolutionary advances in QCD computational methods, and now you can get all those things just by running MadGraph. It's really amazing.

MICHAEL: Go ahead.

>> Yeah, we don't yet have that for NNLO, but it's definitely coming, and it needs to be supported, in particular in the US. It's almost impossible to get an assistant professorship if you work on computational QCD. Nevertheless, there's enough activity in Europe that this is going forward. And it's something I hope you comment on in your report: this is a subject that's very important and needs support. Okay, that's comment one. Maybe someone else wants to talk before I go to comment two?

>> Excuse me, Michael, in the interest of time, would you mind maybe saving this for a parallel session or a coffee break? We're a few minutes behind.

>> Let me just ask the question, and then people can think about it. I think collider physicists underestimate the importance of the lattice determination of alpha_s. It's really beautiful. It's got a small error, and it's going to keep getting better, because the experimental input is just the ψ and the Υ masses, and because the computers keep getting better. We need to reevaluate why we want to measure alpha_s at colliders. There are definitely good reasons; I don't need to explain them now, but let's talk about it later in the week. We need to change our focus, because the lattice is giving us the real information. Thank you.

MICHAEL: All right. Thank you.