Energy Frontier Workshop >> It looks like people are slowly coming back from the break. >> People are walking in, so give me another minute maybe. >> OK. >> Uh-huh. >> Recording in progress. >> There was a very good assortment of snacks and delights in the coffee room. That's why it took a bit longer to come in. >> Too bad to miss that. >> Should we begin? OK. So please take your seats. It is my pleasure to introduce the next two sessions. First, we will have the EF05 and EF06 sessions focused on QCD, organized jointly by both groups. I will be chairing from the room, and Stefan is the main chair for the session; he will be on Zoom. We will take turns taking questions from the room and from Zoom. During the first hour we will have four talks focusing on physics topics. We don't have a formal program for the second hour, which will also involve the EF07 group; we will start discussing the joint report from the three groups. The second hour will be a very freestyle discussion, and you are welcome to stay or to leave. With that, I pass the floor to Stefan, who will introduce the first speaker. >> Stefan: Thanks. As already said, hello everyone, and welcome to the session. We will have four short contributions on topics which featured in the EF05 summary talk on Monday. They tie in with the important focus questions on the precision of the strong coupling, precision measurements, and the quantification of perturbative and non-perturbative uncertainties. We will start with David d'Enterria on the strong coupling. >> David: Hello. Can you hear me? Very good. I have been asked by the EF05 conveners to present an overview of the alpha_s white paper that you can find on the arXiv. It is the outcome of a workshop with many presentations; we have the link to the Indico agenda there.
We received about 4 letters of intent in Snowmass 21 related to alpha_s extractions, and the conveners gave me the homework of preparing a white paper discussing the current state and the experimental precision that one can reach on alpha_s. We decided to organize a meeting with theorists and experimentalists addressing all of this. We wanted to know, from the theory point of view, the current state of the pQCD and non-pQCD corrections and uncertainties, and to provide a wish list of common theory and data developments needed to reduce the alpha_s uncertainty; and, on the experimental side, the sources and status of the systematic uncertainties and their future reduction at both current and future machines. Based on this homework, we organized a workshop in February together with Stefan and others, and the outcome of this workshop was this white paper with 70 authors, 130 pages and 80 figures, with the 10 sections of the document that I will rapidly summarize today. The motivation, as you know: the QCD coupling determines the strength of the strong interaction between quarks and gluons. Determined at a reference scale, usually the Z pole mass, the running of the coupling follows the evolution shown here, and you can see how the uncertainty has developed over the last 30 years: from a 6% uncertainty down to 2.5%, reaching about 0.5% in 2013. Today it is about 0.8%, a little higher, because we have many more extractions; they are fully consistent, but the average doesn't allow us today to reach an uncertainty below 0.8%. The first motivation is that the QCD coupling is the least precisely known of all the interaction couplings; even the gravitational coupling is better known than alpha_s, so that's already a good motivation to study this. This is important for calculations of Higgs hadronic production and decays: the uncertainties on gluon-gluon fusion and ttH production carry a sizable alpha_s parametric uncertainty.
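The running just described follows the QCD renormalization group equation. As a rough illustration (not part of the white paper), here is a minimal one-loop sketch, assuming n_f = 5 active flavors and the world-average reference value at the Z pole; higher-loop terms and flavor thresholds, which real extractions include, are omitted:

```python
import math

def alpha_s_one_loop(q, alpha_mz=0.1179, mz=91.1876, nf=5):
    """One-loop running of the strong coupling from the Z pole to scale q (GeV)."""
    b0 = (33 - 2 * nf) / (12 * math.pi)  # one-loop beta-function coefficient
    return alpha_mz / (1 + alpha_mz * b0 * math.log(q**2 / mz**2))

# Asymptotic freedom: the coupling shrinks as the scale grows
print(alpha_s_one_loop(10.0))    # larger than alpha_s(MZ)
print(alpha_s_one_loop(1000.0))  # smaller than alpha_s(MZ)
```

This makes concrete why a single reference number, alpha_s(M_Z), fixes the coupling at every scale: all the extraction methods in the talk quote their results evolved to the Z pole.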
The same holds for top-mass calculations: going from the MSR scheme to the pole mass scheme you have today a parametric uncertainty of 70 MeV. The same for the ratio of the hadronic to leptonic decay widths of the Z boson, and so on. Also, approaching the Planck scale, alpha_s is the least important of the uncertainties on the scale at which the vacuum becomes unstable, but it still enters at the order-of-magnitude level. And of course you want to test new colored sectors and so on. Today alpha_s is determined by the combination of 7 categories of experimental observables, compared to predictions at NNLO or N3LO accuracy. If you check the PDG value you see the 7 categories with all the individual measurements and the global average. Each category has an uncertainty between about 0.7% and 3.3%. We ran in detail through all these extractions and how each can be improved, so let me go rapidly through the seven categories. I will only summarize the lattice extraction quickly, since it carries the largest weight in the average, and then cover the other six. Rapidly then: alpha_s from lattice QCD is the most accurate extraction today. The idea is to take short-distance quantities like the QCD static energy and light-quark and heavy-quark current correlators, computed at high orders in pQCD, and compare them with lattice data which is constrained by experimental inputs like hadron masses, decay constants and so on. Then you have various extractions. The nice thing is that the lattice community has more or less agreed on a series of criteria which each one of these extractions has to fulfill in order to be incorporated into the average, which you can see here: 0.118. The current uncertainties are the pQCD uncertainty from the truncation of the expansion of the observables, and the lattice spacing, which is mostly determined by the computing power you have.
The projection is that you can halve the uncertainty in alpha_s by improving your computing statistics, reducing the lattice spacing, computing higher-order pQCD corrections, and also extending the step-scaling method to more observables so that the renormalization-scale uncertainties are reduced. For the tau lepton decays, the theory is known at N3LO, and the uncertainty today is driven not by experiment but by the treatment of the OPE in the theoretical expansion. You see the progression here: the FOPT and CIPT prescriptions give a small but systematic difference, which translates into a lower alpha_s for one and a higher one for the other, and they also differ in how they deal with the non-pQCD corrections. The important thing to note is that there are non-pQCD corrections here which are not small, because the tau mass is only about 10 times Lambda_QCD. This brings in at least a 2% uncertainty that you need to somehow reduce with experimental data, and the hadronic spectral function is not accurately measured in some regions. In order to improve we need N4LO calculations, a reconciliation of FOPT versus CIPT, better data, and better spectral functions. Belle II could provide results that would pin down the non-perturbative corrections. For the extractions from DIS and PDF fits, we have N3LO computed for the structure functions and NNLO for the PDF global fits. The idea is to incorporate new structure-function results and polarized quantities at the EIC. On the global fits: the four global fitters now obtain somewhat larger alpha_s values, closer to the average. DIS and fixed-target data prefer a lower alpha_s whereas the LHC data prefer a higher one; you can push one up and the other goes down. We need N3LO here, and with that, improving the fits and adding more data, we can reach about 2 per mill; this is a value that should be within reach of all the global fitters. For event shapes and jet rates you see a broad spread of extractions, depending on whether you correct for hadronization with Monte Carlo or analytically. This gives a large uncertainty.
There are new developments improving the analytical power corrections, and you see them here compared to less developed calculations, as well as the use of grooming to reduce the impact of non-perturbative power corrections for event shapes. Belle II could help here too. For hadron colliders we use top-pair and W and Z boson production. Let me go to the next slide rapidly. We compare cross sections computed with different PDFs and we get results close to the average, both for ttbar, 0.117, and for W and Z, 0.117 and 0.118. We want to fold all of these into the global fits, and since those are done only every few years, standalone alpha_s extractions are interesting at the moment, because there is an open issue of how to deal with missing higher-order uncertainties in global fits. There is also a new extraction from jet data at NNLO; this is a nice development because the upcoming data will be incorporated. The uncertainty here is mostly driven by missing higher-order corrections. The last category is alpha_s from the electroweak precision fits. This is an extremely clean extraction with no non-perturbative uncertainties, because it sits very far above the QCD scale. It requires exquisite control of the uncertainties, however, because the dependence on alpha_s enters only through higher-order corrections: only when you start to exchange gluons do you become sensitive to alpha_s. For the hadronic width of the Z, about 96% is a purely electroweak quantity and only about 4% comes from pQCD and electroweak corrections. You need to accurately determine the Z-peak pseudo-observables: since only a few percent of the observable is sensitive to alpha_s, any relative uncertainty propagates to alpha_s multiplied by about 25, and you get 1.5 to 2% today. You can also extract this with a global electroweak fit, leaving alpha_s as a single free parameter, and you get the same result. At this conference we heard about a new Z- and W-based alpha_s extraction with higher-order electroweak corrections and corrected Z-to-leptons data, and this improves the agreement between the extraction based on the Z boson alone and the global fit. There was also the first such extraction using the W boson data, which is not yet competitive.
We only have about 50,000 W bosons from LEP, so they don't have any constraining power on alpha_s, unfortunately. In order to get to per-mill uncertainty, and this is the only non-lattice observable that can reach it, we need a machine like the FCC-ee with 10^12 Z bosons and 10^8 W bosons, plus very much reduced parametric uncertainties on the masses and widths of the Z and W, and on the CKM matrix elements in the case of the W. The current extraction from the Z pole is consistent with the world average, but if the central value stays where it is, once we have a future collider running at the Z pole it will need attention. And if the W-based value stays where it is once averaged, you clearly see those two extractions would be inconsistent. This is an interesting stress test of the Standard Model. Just to summarize, we have a table in the white paper where we present the current theory, experimental and total uncertainties for each one of the methods. To obtain these, we have studied in detail how each of the uncertainties can be reduced by including all the things I mentioned in my slides; you can go through it here. This is a summary. We also provide a wish list of the experimental and theory developments needed to reach each uncertainty. We have four items on the wish list. For lattice QCD: higher-order perturbative inputs, computing resources and power to reduce the lattice-spacing uncertainties, extending step scaling to more observables, and people computing one order higher in pQCD for the lattice observables. Theoretical efforts are needed for tau decays, power corrections, resummation, and multi-jet observables. Lots of HepData from the LHC can be compared to NNLO predictions, and this can feed the hadron-collider and PDF-based extractions. And multiple new LHC precision measurements, ultimately reaching 2 per mill uncertainty in principle. Hadronic Z and W decays are the only non-lattice method known that can reach per-mill or better precision, and for that we need a machine like the FCC. That's it from my side.
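The averaging of category results described in the talk is, at its core, an inverse-variance weighted combination. A minimal sketch follows; note that the PDG applies a more elaborate chi-squared procedure with error inflation, and the numbers below are purely illustrative, not the actual PDG inputs:

```python
def weighted_average(values, errors):
    """Inverse-variance weighted mean of independent results and its uncertainty."""
    weights = [1.0 / e**2 for e in errors]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    err = (1.0 / sum(weights)) ** 0.5
    return mean, err

# Hypothetical alpha_s(MZ) category results (illustration only)
vals = [0.1182, 0.1178, 0.1188, 0.1170]
errs = [0.0008, 0.0019, 0.0011, 0.0021]
mean, err = weighted_average(vals, errs)
```

The sketch shows why the combined uncertainty is always smaller than the best single input, and why the most precise category (here the lattice-like one) dominates the weight, as stated in the talk.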
Thank you. >> Thank you for the great summary. Do we have questions? >> Yes, if you wish to ask a question, please raise your hand. Michael? Then we will get back to Zoom. >> Could you go back to slide 13, please. Yes. Yeah, down at the bottom of the right-hand plot, the Abbate et al. and Hoang et al. analyses get to the per-mill level, but the central value is off from what the lattice gives. Those papers are, you know, not silly papers. They are very serious, very sophisticated theoretical analyses. It is really a mystery why those two results don't agree. I think it has to do with things we don't understand about how the final states are produced in e+e-, which is much easier than producing final states at a hadron machine. Frankly, I made a comment like this earlier in the week: the reconciliation of this method with the lattice methods is going to teach us more about modeling QCD than anything else. It really is a priority. So I think that's a very interesting way to look at this. >> Yes, this has been a long-standing issue for some years now, the fact that these two extractions with state-of-the-art analytical calculations do not seem to agree with the average. You see them here, and you see the formulas here for both. There are developments here. Two things. First of all, there are now improved power corrections, and you see the calculations here for the C-parameter, with similar results expected for the thrust; apparently they now get a clearly different result from what you have in Monte Carlos, which do not account for these power corrections properly. These analytical power corrections, which are important for the event shapes, will be incorporated, and then a new extraction should start soon. They demonstrated that those power corrections are large and are now under control, and now they have to do the extraction. That's one development on the purely theoretical side.
On the hadronization side, we have all these modern jet substructure tools, soft drop, grooming and so on and so forth, and you see the impact for the thrust also. This is the result for alpha_s, and you see that if you apply different z-cuts for the grooming, you see how the thrust distribution changes; you see it here also. The idea is that by including these analytical developments on the power corrections, plus reducing the hadronization uncertainties with the grooming techniques, this span of extractions from all the observables will get much reduced. Ideally, what I am trying to tell you is that the Monte Carlo-corrected extractions will move closer to the average, the analytical ones will move closer as well, and hopefully they converge. Otherwise it is an important issue indeed. >> Just one more comment. You have emphasized here the importance of the new Belle II dataset and the possibility of using it to understand these issues. I hope you are recruiting people in Belle II to be sensitive to them. It could have a huge influence on how we understand QCD. >> I have been advocating for this for years. There are caveats. The [indiscernible] that can be accepted at Belle II is not as good. Bogdan is in the audience, so he can comment on this. And also jet shapes, sorry, jet rates at Belle II: it is difficult to reconstruct jets at 5 GeV, but I agree we should try it. The only thing I am saying is that it is not as easy to do those analyses at Belle II. >> What can be measured better with Belle II is the shape. The normalization is better known due to the boost of the taus. >> There is a white paper that discusses some of these issues. They have a plan, so we probably need to talk to them more about it. Are there any questions on Zoom? Comments? >> I don't see any hands raised on Zoom. Let's move to the next talk. >> Thank you. >> Can you kick me out from sharing? OK. The next speaker will be Stephen Jones on the precision calculations wishlist. Stephen, please go ahead.
>> Thank you very much for the invitation to speak here. Apologies if this is not as polished as I would like; I stepped in at the last moment, but hopefully I can give you an overview of what is happening as part of the Les Houches precision wishlist update. I will start with a brief background on Les Houches, then I will talk about the complementary goals, then the work in progress, which is the currently ongoing wishlist update, and then two selected examples that have been completed since the last wishlist in 2019. As always, I give the Les Houches disclaimer: there is an enormous amount of work going on, so trying to summarize it all is impossible and this will be incomplete. I will try to give you a feel for where the field is going and some interesting results. Les Houches is a vibrant in-person meeting set over four weeks; you can see the output of the 2019 edition at these two links. In 2021 it was a modest virtual event, and I think this had a negative impact. The wishlist will be updated, and hopefully we can get back on track in 2023. Yeah, the vibrant interaction was replaced by an Indico page, and it didn't help the collaboration. I think it is important to bring the experimental and theoretical communities together in person to get the most out of these events. What is the wishlist? All aspects of an event are important: higher-order matrix elements, parton showers, PDFs. But the wishlist focuses on advances in fixed-order QCD and electroweak predictions for processes relevant at the Large Hadron Collider and the high-luminosity upgrade to the LHC. If something fits into the formula I show at the bottom, a fixed-order calculation with a QCD correction, an electroweak correction or a mixed QCD-electroweak correction, it probably belongs in the wishlist. We try to keep a sort of running table that shows the most up-to-date calculations, covering of course a wide range of things.
Part of the goal here is to help experimentalists answer questions like "are we using the most accurate results?", to encourage them to think about what results they want and need, and to communicate that to theorists; for theorists it is a list of which results are known and which theory advances allow higher-order calculations to be performed. The goal is to facilitate communication between the experimental and theory communities: if it encourages them to speak to each other, it is doing its job. The second goal is to encourage experimentalists to stay up to date with milestone theory results and to motivate theorists to attack experimentally interesting questions. There is also a joint Les Houches and Snowmass LOI that was put forward, with many of the same goals. We proposed exploring the higher-order calculations needed for 33 or 100 TeV colliders given the technical capabilities at the time: rather than focusing on the next five years, focus on a longer period of time and try to project forward. That could be an interesting exercise, I suppose. Going back to the wishlist: it generally starts with an introduction setting up the terminology and common approximations, and this is important for helping the two communities understand what the other is requesting or doing. Then it has an overview of the relevant theoretical advances, and the process tables are the meat of the report. We cover Higgs, jet final states, vector boson and top-quark processes, with each table accompanied by a summary or review of the literature: what existed at the last update, what's new, and the current experimental outlook. Let me flash some of the tables at you. I should note that these are currently being updated, so if you see a calculation of yours missing, or something that you strongly desire, do feel free to communicate that to me; that would actually be interesting.
One thing that is broadly remarkable is that despite the challenges of the last few years there has been a lot of progress. Several items on the 2019 wishlist were completed; the new calculations are marked with a red star. Some of them were on the previous wishlist and some went beyond expectations. These are the new processes that have been computed just since the last wishlist was published in 2020. There are advances in theoretical techniques putting new processes on the horizon: we can now consider it well motivated to desire some things that would have seemed too far in the future even a few years ago. There is also significant theoretical work motivating the addition of new sections, with interesting theory behind it. I know that is more ambitious experimentally, but I think it is important to keep an eye on, because it will be a very interesting area to look at. Just to actually talk a bit about physics, let me give a single example of a pretty monumental calculation that was completed between the last update and this current update cycle, and that is gluon-fusion Higgs production at N3LO, meaning we can go from inclusive predictions like on the left to applying the experimental cuts to give fully differential results in the fiducial volume of the detector. This raised a few questions. The perturbative convergence of the inclusive result looks reasonable, with a modest K factor at this order, which is what you hope for and expect. But when you apply the fiducial cuts, the convergence deteriorates and strange artifacts appear repeatedly in the distributions. These turn out to be due to a funny interaction of the ATLAS cuts: there is a barrel/end-cap crack cut in there, and there are also the photon pT cuts, and it turns out that when you apply all of this and go from the Born state to radiating additional jets, it can have some influence on what you see in your perturbative results. So there was a discussion last time around: is this something we want or expect, and what should we do?
It turns out one way to cure this is to think differently about these cuts. It is somewhat similar to a discussion that happened previously regarding symmetric dijet cuts, where it was realized that asymmetric cuts tend to be more perturbatively stable; now it turns out that product cuts allow this behavior to be cured or avoided. There is also another solution: what is actually causing the issue is linear power corrections that appear when you try to restrict yourself to the fiducial volume, and resumming them alleviates the strange artifacts that were appearing. But I think it is worth pointing out that the product cuts allow you to stick to a fixed-order prediction, which is somewhat desirable in some situations, or alternatively one can resum. Then, just to give an overview of what happened in Higgs physics: progress is steadily beating down the various sources of theoretical uncertainty. We just heard about advances that could be coming for alpha_s; there is a very large uncertainty due to PDFs and alpha_s. Then there was an uncertainty related to top-quark mass effects that had been neglected, which has now been largely removed, and similar techniques could be used to remove the uncertainties from bottom and charm effects. The calculation of the mixed QCD-electroweak corrections allowed the electroweak uncertainty not just to be reduced but to be known a bit more accurately, so it turns out it was pretty important to calculate that. And then we still have uncertainties related to missing higher orders, with the scale uncertainty remaining at N3LO. So we are pinning down the uncertainties and the areas where we can do better. Moving on to vector boson production: big-ticket items were completed since the 2019 wishlist. There have been several significant new results which could have implications for PDF fits, especially at N3LO. And we reworked some sections just to capture advances in specific channels.
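The three cut choices discussed above can be stated concretely. A sketch follows, applied to the transverse momenta of the two decay products; the thresholds are purely illustrative and not the actual ATLAS analysis cuts:

```python
def symmetric_cut(pt1, pt2, thr=25.0):
    """Both objects above the same threshold (perturbatively delicate)."""
    return pt1 > thr and pt2 > thr

def asymmetric_cut(pt1, pt2, thr_hard=25.0, thr_soft=20.0):
    """Harder threshold on the leading object, softer on the subleading one."""
    lead, sub = max(pt1, pt2), min(pt1, pt2)
    return lead > thr_hard and sub > thr_soft

def product_cut(pt1, pt2, thr=22.5):
    """Cut on the geometric mean of the two transverse momenta."""
    return (pt1 * pt2) ** 0.5 > thr
```

The point made in the talk is that the product cut's smooth boundary in (pt1, pt2) removes the linear sensitivity to soft recoil near the cut edge, so the fixed-order prediction stays stable where the symmetric cut does not.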
You see theory results that are applicable only to particular choices of the vector boson, and we wanted to capture what was actually going on. Lots of new processes calculated, or rather, higher orders calculated for these processes. Just to give an example: Drell-Yan is now known largely at N3LO. There is an unusual feature present in photon-mediated, W-mediated and Z-mediated production, which is that the N3LO scale-uncertainty band is not always overlapping the NNLO band as a function of invariant mass. It seems the NNLO scale uncertainty was small and underestimated, due to cancellations between the different channels, and this highlights the importance of computing higher-order corrections even when you think you know something precisely enough. It also, I guess, underscores the importance of thinking about how we assess scale uncertainty: is it really the best thing to vary mu_R and mu_F by a factor of 2 and assume that captures the missing higher orders? We also have fiducial predictions at N3LO for Drell-Yan. There one sees again that these product cuts reduce some of the slicing errors, the linear power corrections that come from applying essentially a slicing procedure and then trying to compute within a fiducial volume. These are broader effects appearing in many processes. Top physics: again, several new calculations, important work on ttbar production, and several items on the 2019 list completed. We added a new section for tZj, motivated by the fact that this is measured, probably belongs on the wishlist, and appears as a background for vector-boson scattering, so I think it is an important process to have on there. Where possible we try to capture new results and give credit even if they were missing from the previous wishlist. In conclusion, I think there has been some really incredible progress since the last Les Houches. Many interesting topics have emerged, like mixed QCD-electroweak corrections and much more. I am personally looking forward to 2023, hoping it will be in person at the beautiful school. Thank you for listening.
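The conventional prescription questioned above, varying mu_R and mu_F by a factor of 2, is commonly implemented as a 7-point envelope that drops the two extreme scale ratios. A minimal sketch, with a toy cross-section function standing in for a real perturbative prediction (the logarithmic form and coefficients are illustrative only):

```python
import math
from itertools import product

def scale_envelope(sigma, mu0):
    """7-point scale variation: vary mu_R, mu_F by factors 1/2, 1, 2,
    dropping the two extreme combinations with mu_R/mu_F = 4 or 1/4."""
    points = []
    for kr, kf in product([0.5, 1.0, 2.0], repeat=2):
        if kr / kf in (4.0, 0.25):
            continue  # skip the extreme scale ratios
        points.append(sigma(kr * mu0, kf * mu0))
    central = sigma(mu0, mu0)
    return central, min(points), max(points)

# Toy cross section with residual logarithmic scale dependence
toy = lambda mur, muf: 10.0 * (1 + 0.1 * math.log(mur / 100.0)
                                 + 0.05 * math.log(muf / 100.0))
c, lo, hi = scale_envelope(toy, 100.0)
```

The Drell-Yan example in the talk is precisely a case where this envelope at NNLO failed to cover the N3LO result, which is why the speaker asks whether the factor-of-2 convention really captures the missing higher orders.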
>> Thank you for this great talk. >> Do we have questions from Zoom? >> I don't see any hands raised. >> I would like to add more items to the wishlist. >> Excellent. Let me get there. >> First, why. In many experimental analyses, like in the Higgs and electroweak sector, we are dominated by modeling uncertainties, and most of them are parton shower uncertainties and matching scales. The way we do it is, I believe, not perfect; it is highly imperfect. We are double counting uncertainties: we take one model, say [indiscernible] versus Madgraph, we change the shower in an artificial way, and so on. So we are counting a lot of things which are redundant, and there is potentially some lack of fine-tuning of the models we use. The experimental uncertainties are dominated by these. We ask experimentalists to reduce their systematic uncertainties, but it is a shame that the measurements are dominated by the way we treat theory, or better, model uncertainties. So I would like to add to the wishlist the formulation of prescriptions for how to assess these uncertainties, parton shower, matching scales and so on, in a way which avoids double counting and is done properly and accurately. It may turn out that they are [indiscernible] in our experimental measurements, but I think right now we are lacking the knowledge of how to properly assess these uncertainties. >> Firstly, let me completely agree with you. Yes, I think this is very disheartening also from the theory perspective: for somebody who computes fixed-order predictions to then see a parton shower applied and dominating the uncertainty is very disheartening from my perspective. I absolutely agree it is a critical point for the community to better understand how to use parton showers, matching scales, and all of the points that you just raised. I agree it is a very important point for the community to come up with, say, formulations or prescriptions; one such effort was mentioned in the last talk.
There are groups and collaborations studying, you know, the limits of the parton shower and trying to improve its logarithmic accuracy. I think this is important. From the perspective of the wishlist, we focus on advances in fixed-order predictions; that is sort of the agreement. That is not to say parton showers are not important, and for a lot of the fixed-order calculations I mentioned, the first thing people do is apply a parton shower or resummation. I don't know if it belongs in the wishlist, and one reason I say that is looking at the authors of the wishlist: we are not the experts in this. I would be very, very happy if parton shower experts said they would like to contribute to the wishlist and extend it to have both a fixed-order and a parton-shower section. But I don't think we are the people who should be writing these prescriptions at the moment. I do think it is an important point for Les Houches and Snowmass. >> I am saying Les Houches could be the best venue to gather the experts in Monte Carlo implementation, resummation, and parton showers. I think you could be the driving force. >> Yeah, 100%. I absolutely agree with that. I think Les Houches is the right place for this discussion to be happening. >> This is a discussion that is also happening in a lot of the CERN working groups, right? The top working group, electroweak, the Higgs Working Group. This is the issue these days: bring together theorists and experimentalists to assess specifically these kinds of modeling errors, which are sometimes very multi-layered and complicated, to first of all come up with a general recipe and understand them on a theoretical basis. Those are the places where activity is always present these days. >> That's true. Yup. >> So, Stefan, you are out there in Zoom land. >> I had my hand raised. Very good. Would you like me to say something about this? >> Yes, go ahead. >> That is a very good point that was just made.
In fact, in the last several workshops there were various efforts to assess these uncertainties by doing comparisons between the various parton showers that exist and have been developed for a long time, which have different underlying assumptions, like the transverse-momentum-ordered shower in Pythia and the showers in [indiscernible] and Sherpa, which lie on different ends of the spectrum of what is possible in the Monte Carlo. While this is not, let's say, a well-defined theoretical handle on what the true uncertainty is, because it is just different options for parton showers, it may nevertheless give you a rough estimate of what the true uncertainty might be. Such comparisons have been made for Higgs production, Higgs plus jet production, inclusive jet and dijet production, and Z+jets, and lately also for vector bosons. It turns out that in many cases the true uncertainties in the Monte Carlo predictions are much smaller than what is found by the experiments, because quite often the Monte Carlo setups used by the experiments are not consistent between the different Monte Carlos. Joey, for example, was heavily involved in this. We always try to advocate that whenever comparisons are made in the experiments, people get involved in order to make sure there is an apples-to-apples comparison, in which case the uncertainty is smaller than what is being found by using the Monte Carlos out of the box. That is not to say the uncertainties are small, but they are smaller to begin with. These studies are being done, and I believe they will continue in the context of Les Houches, because this is a very important point and we don't want to degrade the fixed-order calculations. >> I think in the interest of staying on time, maybe we should move on to the next talk. Thank you, Stephen and Stefan. >> Putting my chair hat back on: thank you again, Stephen. The next speaker will be Jennifer Roloff with a summary of the ATLAS and CMS contributions.
>> Can you see my slides and hear me? >> Yes. >> Great. So, yeah, as mentioned, I will be giving today a summary of mostly the QCD portions of the ATLAS and CMS white paper. As you are probably aware, ATLAS and CMS jointly submitted a white paper on the HL-LHC physics potential. It includes analyses from the Yellow Report as well as many new projections. This week there have already been quite a few nice summaries of various parts of the white paper, which I have linked here for anyone interested, but you probably have access to these already if you are at this workshop. Today I will be covering the results relevant to EF05 and EF06, focused mostly on jet and photon cross-section measurements. Starting with the photon cross-section measurements: there are inclusive photon measurements as well as others that were not studied in the white paper. They are useful for a lot of different things in QCD. They are very useful in PDF fits for the gluon density, and they provide precise tests of perturbative QCD predictions: you can see in the ATLAS measurement at 13 TeV here a comparison of the data to NNLO predictions as well as several other predictions. They also provide tests of electroweak corrections at the TeV scale as well as tests of fragmentation models. So there is a lot of different physics they cover, all related to QCD. The HL-LHC will provide the opportunity to extend the reach of these measurements to much higher energies; in particular, the projection here is for the inclusive photon measurement. Compared to the 2 TeV reach in ET gamma measured at 13 TeV with 36 inverse femtobarns, we can extend this to 2.35 TeV. Similarly, if you look at the photon-plus-jet measurements and compare to the existing ATLAS measurement, this will extend the range from somewhere around 3 TeV in m(gamma, jet) to somewhere around 7 TeV. This is a pretty large extension in reach that we gain by increasing the dataset.
As I mentioned a couple slides ago, one of the important things these are used for is PDF fits. You can see the impact these have on the PDF fits in these plots here, where you can see a comparison of several sets as the ratio of them to the MMHT2014 set. You can see there are large differences between these, particularly at high ET gamma. You can also see a comparison of the size of the experimental uncertainties to the differences between these PDFs. Yeah, extending the reach to these high ET gammas will be very useful for constraining these PDFs. The uncertainties shown here are the uncertainties from the ATLAS measurement at 3.2 inverse femtobarns at 13 TeV. There will be higher statistics, which will reduce some uncertainties, and there are probably improvements to the calculation. So, you know, the reach of these measurements may improve through improving the experimental uncertainties. >> Similarly for the high-pT jet measurements: jet cross-section measurements are useful for many different things in QCD as well, notably PDF fits, alpha-s extractions, and improving the understanding of perturbative QCD. Again, just as for the photon measurements, the large dataset means we will be able to probe jet cross sections at very high pT compared to what we are doing now. You can see the CMS measurement with 36 inverse femtobarns here, which is able to extend to somewhere around 3 TeV. The ATLAS projection extends to somewhere between 4 and 6 TeV, which is obviously a pretty big increase. And yeah, similarly you will be able to perform these measurements for a wide range of rapidities. This increase comes not just from the luminosity but also from the increase in the center-of-mass energy from 13 to 14 TeV. You can see the ratio of the cross section at 13 TeV with 150 inverse femtobarns compared to 14 TeV with 3 inverse attobarns of data. You can see in the high-pT tails especially you have much more data at 14 TeV due to the center-of-mass increase.
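How the kinematic reach grows with luminosity can be sketched with a toy power-law spectrum. The spectral index n = 6 below is an illustrative assumption, not a number from the talk; for dN/dpT falling like pT to the minus n, the pT at which a fixed number of events survives scales like L to the power 1/(n-1):

```python
# Toy reach estimate: for a falling spectrum dN/dpT ~ pT**-n, the
# event count above a threshold scales as pT**(1-n), so the highest
# reachable pT scales as (integrated luminosity)**(1/(n-1)).
# n = 6 is an assumed illustrative spectral index.

def reach_growth(lumi_ratio: float, n: float) -> float:
    return lumi_ratio ** (1.0 / (n - 1.0))

# 3000 fb^-1 at the HL-LHC vs a 36 fb^-1 Run-2 dataset:
factor = reach_growth(3000.0 / 36.0, n=6.0)
print(f"pT reach grows by roughly a factor {factor:.2f}")
```

This neglects the extra gain from raising the center-of-mass energy from 13 to 14 TeV, which is why the high-pT tails benefit even more than pure luminosity scaling would suggest.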
So one of the things that is covered in this is the relevance of doing different types of jet measurements. CMS explored the possibility of looking at not just inclusive jets but also boosted W+jets and boosted ttbar, and the cross sections for these are all, of course, much lower than for the inclusive jets, but by using tagging information and event selection you can be sensitive to these different processes. There is a lot of different physics that these different processes probe. I don't have time to go through it all, but one thing that is remarked on in the white paper is that this enables measurements sensitive to color reconnection. In particular, one of the variables that was considered is the delta phi between the two jets in the measurement, which is sensitive to the soft gluon radiation. You can see that these different processes have different behaviors. For instance, for the W+jets there is no color reconnection between the W boson and the recoil jet, and you can see the delta phi variable has different behavior. The point isn't just that this particular variable will be interesting to measure, but that there are a lot of types of physics you can probe by doing various types of jet measurements, not just with inclusive jets but with all of these different types of jets we have available, and we will have so many more of them available as well. We will be able to extend the reach of these and really make the most use of all the jets that we have. Lastly, just as for photons, like I said, jet cross-section measurements are useful for PDF fits. You can see a comparison of several different PDF sets compared to the size of the relative uncertainties for the jet cross-section measurement, with projections for what they will be. You can see there are very large differences between the PDF sets, especially at high pT. Yeah. And so, again, this indicates that we might be able to help constrain these PDFs so that we have smaller uncertainties.
And one thing that was looked at, this was not done by ATLAS specifically or by either of the collaborations, but there was a PDF fit done with several different projections of measurements from the HL-LHC, and you can see a comparison here of the size of the uncertainties between the CT14 PDF set and the projections of what they will be at the HL-LHC, and you can see they will be greatly reduced. These are going to be very important for improving our PDFs, which will be useful not just for this measurement but many other measurements as well. Just in summary, the HL-LHC will provide a lot of physics opportunities for understanding QCD better. These measurements are crucial input for PDF fits, tests of perturbative QCD, extraction of the strong coupling constant, understanding of parton showers, and a lot more. With the huge dataset and increased center-of-mass energy, this enables measurements in more extreme regions of high pT and high ET gamma and so on. We only have HL-LHC projections for a few QCD analyses, but that doesn't reflect the full reach we will have for studying this. This is just a snapshot of a couple of different standard measurements that are going to be very useful. There are a lot of new opportunities as well to perform new measurements and help our understanding of QCD. That's all I have. >> Thank you, Jennifer. >> Do we have questions on Zoom? >> I don't see any hands raised on Zoom. >> Questions in the audience? Jennifer, thank you so much. This is a very interesting talk. I have a historical question. Photon production was used to constrain PDFs before jet production, and it was later realized jet production provides cleaner constraints on the PDFs than the photon. As we look forward, if we bring back the photon projection as a constraint, does anyone know how it will compete against jet production, both as a function of the pT as well as the luminosity? >> Yeah, not sure exactly.
I know that in the HL-LHC fits here, I do believe they included the photon measurements, or photon projections of measurements, which if I remember correctly covered, like you say, a very similar region of phase space for constraining the PDFs. I don't think it had, if I remember, a breakdown of exactly where the sensitivity came from for each of these analyses. I could check, but yeah, not sure exactly how competitive they will be. But PDFs are not the only thing those measurements are used for. >> It will be interesting to see. I see hands on Zoom. >> Maria? >> This is not on this topic. I have a different question. Maybe it is also -- and maybe if anyone wants to comment on this first? >> Yes, since we did the analysis a few years ago together on the impact of isolated-photon data on PDFs, I can comment on this maybe. Of course, the advantage of photons is that the energy-scale uncertainty and resolution are better in principle on the one hand. But on the other, the statistical uncertainty is larger because collisions produce many more jets than photons. Also, there is a different flavor impact of the two things. I mean, at the LHC it is mostly gluon-quark scattering that produces the photon. I think probably these should be studied, but at high pT and high x the photon is probably more sensitive than the jets. Jets at high pT are produced mostly by valence-quark scattering. This should be backed up with studies, indeed. The good news is we have isolated-photon and jet calculations at NNLO now, so both can be exploited for PDF extractions at high x also. >> OK. Thank you. >> Let's see. Victoria? >> A question on jet production. In the talk before this one, Stephen Jones pointed out the fact that they discovered for [indiscernible] that power corrections are very important, especially when we are in the presence of fiducial cuts. It seems standard NNLO calculations, let's say, sometimes are not enough if we don't include power corrections.
My question is maybe more to the previous speakers and the theory part. Once we have all this nice experimental data, what can be the impact of these power corrections on jet production, and how critical can they be for making future nice and precise PDF fits? >> I don't know. Stephen, do you have any thoughts? >> I think this correction needs to be studied. There are various kinds of power corrections. The ones we talked about were in the differential distributions due to the fiducial cuts. You can choose smart cuts -- >> You think the cuts can be chosen? From the theory side, to calculate these effects. >> I think it fits into the issues not captured by the fixed-order perturbative calculation. It needs to be studied, of course. Other questions and comments in the room? On Zoom? I don't see any. Stefan? >> No, I think the two hands had been raised already. No, no, it is fine. Before we move, let's thank Jennifer again. >> Let's move to the last talk of the session, which is Mary Hall Reno on the Forward Physics Facility. >> I am unmuted. Very good. OK. Thank you for the invitation to give a brief overview of the Forward Physics Facility white paper. It is 429 pages packed with excitement, including beyond-the-standard-model physics, QCD, astroparticle physics, and neutrinos. I am showing a small selection of figures from the paper. Please take a look at the white paper and the more extended work that is cited therein. So you heard yesterday already from Jonathan Feng a nice introduction to the FPF. He reminded us that it is to capture the physics that would go down the beam pipe: a suite of experiments in operation in a purpose-built facility located 600 meters from the interaction point. Of the neutrino detectors, there are three in this collection. The neutrino cross sections that could be measured there are in a new energy range. For all three flavors shown here, these are just the statistical uncertainties.
For muon neutrinos and tau neutrinos there is the opportunity for separated neutrino and anti-neutrino measurements, and for tau neutrinos, many more tau neutrinos than we have seen directly in the past. It is important to understand the systematic uncertainties in the measurements and also in the predictions for the fluxes, which come from the QCD predictions that lead to the neutrino flux from the interactions at the ATLAS interaction point. The QCD elements then are, of course, in the proton-proton collisions that yield all three flavors, and, through the neutrino interactions, in understanding the parton content of the nuclear targets. Starting here, looking at the hadron production that leads to neutrinos. On the left are pythia results for nu-e plus nu-e-bar per log-energy bin. You can see that kaon production and decay dominate the low and mid energies of nu-e and nu-e-bar, and charm production and decay contribute at the highest energies. The right shows the next-to-leading-order evaluation, with the scale uncertainty, for tau neutrinos and anti-tau neutrinos. It is dominated by the purely leptonic decay into taus and nu-taus, and then the taus decay as well. Before I go on, just notice that in the left figure with the pythia results, there are two different tunes shown. The solid histograms have the Monash tune, and a new tune including some additional data is shown there. For the QCD production of charm, to look at the high-energy electron neutrinos and muon neutrinos and at all of the energies of the tau neutrinos, you are probing new kinematic regimes of very small x on the one hand and very large x on the other. This is because you are in a regime of, well, obviously, very forward high rapidities, talking about rapidities of seven and larger contributing to the forward neutrino flux at the facility. In the small-x region we are interested in the PDF fits and their uncertainties.
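The claim that forward charm probes both very small and very large x can be sketched with leading-order 2-to-1 kinematics, where the momentum fractions are x(1,2) = (mT/sqrt(s)) exp(+/-y). The charm-scale transverse mass of 2 GeV and the rapidity y = 7 below are illustrative assumptions:

```python
import math

# Leading-order kinematics: a system of transverse mass mT produced
# at rapidity y at collider energy sqrt(s) couples partons with
# momentum fractions x1,2 = (mT / sqrt(s)) * exp(+/- y).
sqrt_s = 14000.0  # GeV, HL-LHC energy
m_T = 2.0         # GeV, illustrative charm-scale transverse mass
y = 7.0           # illustrative forward rapidity

x_large = (m_T / sqrt_s) * math.exp(+y)
x_small = (m_T / sqrt_s) * math.exp(-y)
print(f"x_large ~ {x_large:.2f}, x_small ~ {x_small:.1e}")
```

One parton sits at x of order 0.1 while the other sits around 10^-7, which is exactly the simultaneous large-x and small-x sensitivity described here.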
There are large log(1/x) contributions and different approaches to the resummation, in collinear and kT-factorization approaches. And then there is also the issue of small-x gluon saturation, with the large gluon PDFs at small x. The figure on the lower right shows the comparison between gluon PDFs with and without small-x resummation, and this is characteristic at lower Q of the total quark distribution as well. This figure also illustrates not just the small-x uncertainties, between with and without small-x resummation, but it shows the large-x uncertainties too. For large x, as we say, there is the potential, through this neutrino tagging of high-energy charm, to probe the charm content and potentially intrinsic charm. The figure in the lower right here shows the Feynman-x distribution for D mesons, which then translates to an enhanced high-energy neutrino flux at the Forward Physics Facility, including hadronization and pT effects, beam remnants, and particle and anti-particle asymmetries. In terms of QCD in the neutrino-nucleus interactions, the range here of 1 TeV neutrinos extends the x and Q coverage for nuclear targets. It is a chance to compare and contrast nuclear corrections for neutrino-nucleus versus charged-lepton-nucleus DIS, and it is complementary to EIC results. There is the opportunity for understanding and determining the strange PDF with inclusive and di-muon production; that strange determination is tied to charm production. In addition to the dominant charged-current scattering there is neutral-current scattering, and the opportunity to investigate low-Q and low hadronic-invariant-mass contributions. There will be a thousand or so such events in a 10-ton detector over the life of the high-luminosity phase, and opportunities to understand better hadronization and final-state interactions in DIS and the Monte Carlo modeling of neutrino interactions with the same Monte Carlo tools used elsewhere.
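Why kaons and charm populate different parts of the neutrino energy spectrum in the pythia figure discussed above can be partly seen from two-body decay kinematics: for a relativistic parent of mass M decaying to a charged lepton of mass m plus a neutrino, the neutrino carries at most a fraction 1 - m^2/M^2 of the parent energy. A small sketch using standard PDG masses:

```python
# Maximum fraction of a relativistic parent's energy carried by the
# neutrino in a two-body decay P -> lepton + nu: 1 - (m_lep/m_P)**2.
M_PI, M_K = 0.13957, 0.49368   # GeV, charged pion and kaon masses
M_MU = 0.10566                 # GeV, muon mass

def max_nu_fraction(m_lep: float, m_parent: float) -> float:
    return 1.0 - (m_lep / m_parent) ** 2

f_pi = max_nu_fraction(M_MU, M_PI)  # pi -> mu nu
f_K = max_nu_fraction(M_MU, M_K)    # K -> mu nu
print(f"pi -> mu nu: {f_pi:.2f}, K -> mu nu: {f_K:.2f}")
```

Kaons hand a much larger energy fraction to the neutrino than pions do; the charm contribution at the very highest energies then partly reflects the harder forward production spectrum of charm hadrons on top of the decay kinematics.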
There are also connections to cosmic-ray showers, including hadron multiplicities, the production of forward strangeness and its connection to atmospheric muons, and the connection of the atmospheric neutrino flux to charm production in the atmosphere by cosmic-ray interactions. If you would like to endorse, there is a Google form link here, and I also point you to the short FPF white paper posted on the arXiv. There are other related contributed white papers, and the ones in yellow are presented today, but there are many interesting contributions that discuss the physics opportunities with the FPF. Thanks. >> Thank you for this nice summary. Do we have questions? I don't see any hands raised on Zoom. >> Do you have questions in the room? >> Michael has a question. >> Hello. This is really nice. Let's go to slide 5 please. >> Yes. >> I hope you didn't say the change of tune in pythia is the estimate of the error on these predictions? >> No, I did not say that. >> Yeah, this is an important point. Pythia is known to not do very well on the underlying event in pp collisions even in the central region. In the forward region it is almost completely uninformed by data. For data on forward hadron production you have to go back to the ISR, I think. >> My point here is there is work to be done in this area. You want a Monte Carlo for your kaon production, and what you have off the shelf isn't going to work. >> I really agree with that. I agree with what you said, that this is extremely important for the cosmic-ray experiments and interpreting the highest-energy neutrino results, for example at IceCube. We really need this. But on the other hand, the uncertainties here would seem to completely swamp uncertainties from studies in scattering. >> Yes, I think one wants to work with both the tau neutrino and the high-energy electron neutrino fluxes, and clearly it will involve understanding the lower-energy electron neutrinos and the Monte Carlo modeling of their production and so on.
It isn't just measuring the thing and then you know what you have. >> Knowing these fluxes is going to be very important, so thank you very much for proposing this. >> This is a significant part of the reason why it took so long. If you look at the diagram for the typical production of neutrinos that are registered in the nuclear target 620 meters downstream, there are multiple QCD components involved in having a reliable prediction. I think the facility must be considered together with the EIC, experiments on nuclear targets, and the progress with the calculations. That's why, again, I think it is a very challenging but also very interesting project, because it brings multiple pieces together. >> Yeah. Also let me just point out pythia completely ignores the presence of intrinsic charm. If there is intrinsic charm, the distributions are completely different and really interesting from the point of view of cosmic rays. I was just agreeing with the fact that it is interesting because there is work to be done. >> She agrees, she said. All right. Anyone on Zoom? >> I would like to make a clarification to Michael on the knowledge of particle production. One slide backwards. We know this regime much better than at the ISR times, because we have been running LHCf, the dedicated experiment that has zero-degree calorimeters, and they have been able to measure photons, pi-zeros, and I think neutrons down to rapidities beyond 7.5-8. These measurements have allowed tuning pythia, and it is not accurate to claim that we do not know the hadron production better than at the ISR times. Of course, charged [indiscernible] are another thing. In principle, you can assume that at least the neutral energy fraction of the pp collisions at 8 and 13 TeV we know accurately. We know it much better now than at the ISR times. >> I think we all agree there is a lot of room for improved understanding. I think that's the point here. >> It is a comment to David. According to my understanding, this tuning is not revisiting the models for the forward fraction.
The constraint is not so strong; the models are not so strongly constrained. All this non-perturbative physics is not super strongly constrained. This is obtained by improving and retuning, staying with the same model and retuning. I think there is no attempt to change the models within pythia. I want your comment. >> The way the forward fraction behaves is an open issue, you know that well. In principle, in the type of activity we are dealing with, in which you produce forward neutrinos, you have forward hadrons that produce forward neutrinos: you get an interaction and you have a proton surviving. I am not saying pythia is any gauge of uncertainty for forward neutrinos. All I am saying is we know it better than before the LHC started. All these forward and ongoing experiments and others will allow us to improve this, with much better constraints and much better models. The interest is there. We have to do all these measurements, and we know those within a factor of 2-3, I would say, in terms of cross sections. Neither of them could be guessed by some people. >> It is not just a matter of this. If we want to model the neutrino production, we always have one parton at small x and one at large x. We need an approach that also covers the large-x part. You can see that in the typical configurations both contribute: small and large x. You have to understand that. OK. Any other questions or comments about this? Let's thank Mary Hall Reno. >> Thanks again, and thanks to all the speakers of this session. That brings us to the more general session. >> Can I check [indiscernible] is connected? Is that right? >> Yes, I am. >> Thanks again. I will talk a little bit -- well, you are welcome to stay or leave. We would like to discuss a little bit the plans for the joint EF05-EF07 report. There is another big part of our report dedicated to the Electron-Ion Collider and heavy ions. There are two other topics people are interested in, and we didn't have time to discuss them.
I don't know if you have any comments on that, or again maybe we could start a little bit by summarizing the paper and what is in it. >> So, of course, it is a community-wide paper, which is, I mean, like 100 pages long as usual, and involves many people. It is mainly categorized into, and touches upon, five things which are interesting for the high energy physics community. One is electroweak and beyond-standard-model physics, and the precision determination of hadron structure is the most important; that goes into what is needed for high energy physics. There are other things like jet physics and some precise ways to understand jet structure, and some heavy flavor physics. The other part the community was interested in in this case was physics at small x, the structure of protons and hadrons in the small-x region. That's broadly it. Do you want to know anything more specific about it? I was thinking about how to incorporate this, and your thoughts and others', in having this joint report. In both the heavy ion physics and the EIC there are parts that go beyond QCD. If we don't want to fragment it we have to keep it more general; otherwise it needs to be fragmented into various places. >> Well, that's actually a question to Laura maybe. We discussed the possibility of writing this joint report, meaning it is about 45 pages in total, divided as 15 pages per group, but there will be joint parts. One possibility is we include this EIC and heavy ion physics as a part of the report. Is this something that will work for you, Swagato, as a convener of the group, and for the whole frontier in general? I don't know if you have thoughts or preferences. >> Generally, my preference would be to keep this intact in one place, rather than people going to different frontiers, because it is a somewhat different facility from the nuclear physics side. It would probably be useful for people to keep it intact in one place, particularly for the EIC and heavy ion sides.
>> Could this be a separate contributed paper which would be part of EF? >> I know you are talking about the report, but the point is he wants to keep it in one place, all together. >> If we are only talking about one report on QCD, only from EF05, 06, and 07, then we would have to fragment things like the electroweak and pMSSM parts into other places. >> It doesn't have to be all together. We gave you the freedom to decide what is more appropriate. EF01 and EF02 would be happy together because they have a common theme, but in the QCD part, if you see the advantage of separate sections, it can be that way. >> I am asking whether there is a possibility of considering a joint report for EF05, 06, 07, or whether it is appropriate to have a report which contains mostly QCD but goes slightly beyond it. >> I think we can accommodate the electroweak and pMSSM parts and just make sure other groups are able to find that material, right? >> I don't see why not. Laura? >> If there are things that belong in and would be part of the electroweak section and benefit them, they should be in there. I don't see why they have to go into QCD as strong interactions. I understand the matter of keeping it all together, but if they benefit other discussions they should come up in there at least as well. >> Maybe I have a similar question. We just heard about the precision wishlist from Stephen. Many processes are not just QCD but QCD plus electroweak, so I think we are in a slightly similar situation. We would think this belongs to EF05, but these calculations are QCD plus electroweak. >> I think it is a case-by-case situation. If the point you are making is just the improvement of the predictions due to the calculation of the higher-order QCD corrections, then of course the discussion of that in general belongs to your section, but the impact of what you are doing may benefit a study coming up in the electroweak section. It is a fine line.
>> One thing we could do is -- >> I am talking about what Stefan just brought up. >> In that case, what would you say? What if we write the joint report and dedicate 5-10 pages to a clearly defined section with the implications of the studies for the other groups, and another part dedicated to the wishlist? Is that something that might work? >> That is something certainly from our side I would prefer, but it is up to the whole community. >> I would still prefer to have it in one place, rather than, for example, taking a wish and putting the wish in the EF03 report. >> Since these are small and marginal things relative to the whole, if you fragment it too much it could be impossible to find. >> I agree. Answering the question Stefan brought up: you need to know what you are talking about. Are we talking about the effect of certain specific QCD calculations or studies on certain observables that in the end are electroweak observables? Put those in your section, but, as you are saying, make sure the effects of that are mentioned where they belong. >> Does that make sense? >> Yeah, it makes sense. >> The other question that I have is we have this big white paper, and I think we are still missing executive summaries for the white papers. I could imagine there is a general message that goes to the whole community and then there are more group-specific or topic-specific messages. Do we want to coordinate, or at least communicate to the coordinators of the white papers, that we need these two kinds of messages going into the reports? Do you have any thoughts about it? >> At least for the EIC white paper, the way it was written, and in a sense what we and others did -- I don't know, I didn't do anything -- but there are a few bullet points meant to be written in a way that can be picked up directly. >> I know Stefan has this nice executive summary for the white paper on the showering programs. But I think, for example, we have to think about the other white papers and how to do that.
Any other thoughts or questions on what we want to discuss? >> I think it might be good to create a draft outline similar to what was shown earlier and then maybe come back to the community and ask questions and get some input before we produce more text. Make sure it makes sense to everyone. >> Maybe just another quick comment on the BSM part. I have tried my best to sort of go through the white papers submitted in your categories, but as you say there are some that are pretty large. It is very likely that we might have missed something. Even today at the parallel session there were references to those. Let's just make sure we keep in touch, and if we have a list of these results, it is great if you send it around and we double-check we didn't miss anything. Some are obvious, like particles -- it is like we just cover the whole range with different types of signatures and contribute one part of that plot. But there might be others that are less obvious, and yeah. >> OK. >> Just to reciprocate: when we were writing the EF05 and 06 papers, what was missing is a wishlist. Stephen has a wishlist for precision calculations, and again, it would be very helpful to know some processes or studies that you envision where, let's say, the QCD uncertainties dominate; right now we mostly bring up Higgs physics. There will be more examples of that, and we can communicate about that. >> Yeah, I think it applies much better there. For that search, there might be some cases, so keep it in mind, but it is definitely not as strong. >> Right. >> If you have any questions about the inputs, please let us know. >> And right now, regarding the request for the bullets for the Friday meeting: we can't get the conveners together to prepare them by today. The question is, what do you need by Friday? We may be able to have a Zoom meeting before that, but it is difficult to get everyone together.
The other conveners have some other commitments which slow down the process a little bit. But you can think about minimal deliverables for Friday. >> Maybe some input just from you could be useful. Just bullet points. And we understand it is by no means final or close to final, but at least we would have some feedback from a wider community. It can be yours; you can say this has not been discussed yet. Bullet points of your vision of the summary of the report, of what you want the results of the work to look like. >> The white papers were just submitted. We want to start thinking about that. The sooner we think about it, the better it is. >> Does that sound good on Zoom? Is that something we can produce by Friday? >> Sounds good to me. >> We will try to put something together. >> OK. Anything else we need to discuss? That's it? Sorry. >> OK. Thanks again to those who stayed on Zoom, and thanks for this little conversation. And I think, unless there is anything else, we can just close the session. >> Thanks again to all the speakers and for the discussion. >> OK. Thank you, guys, we can call it a day.