# NLO Multi-leg

### From Wiki Les Houches 09


## Revision as of 18:58, 31 December 2008

**Les Houches topics for WG1 (Standard Model and NLO Multi-leg: Session 1 only)**

**1. Collecting results of completed higher order calculations**

The primary idea is to collect in a table the cross section predictions for relevant LHC processes, where available. Tree-level results should be compared with higher order predictions (whatever is known) and K-factors defined for specific scale/pdf choices. The table should also contain information on scale and pdf uncertainties. Inclusive cross sections may be compared with those obtained with standard selection cuts.

Producing such a table would, of course, include a detailed comparison of results originating from different groups.

A collection of plots showing distributions of relevant observables comparing tree and higher order corrections may also be compiled.

A common format for storing parton-level event information in a ROOT ntuple should be developed. The ntuples would contain the four-vector information for the final-state particles, the weight for the central pdf, the weights for the error pdfs (such as those from CTEQ6.6), the LO weight for the event, and possibly the event weights for different choices of renormalization/factorization scale. A scheme for storing such ntuples for higher order processes in accessible locations (for example CASTOR at CERN or ENSTORE at Fermilab) should also be developed.
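As a sketch of what such a per-event record might hold, the following Python fragment illustrates the proposed fields and how a Hessian pdf uncertainty could be computed from the stored error-pdf weights. All field and function names here are hypothetical, not an agreed format; the error members are assumed to come in plus/minus pairs as in CTEQ6.6 (2×22 members).

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class NLOEvent:
    """One parton-level event record (hypothetical schema, not a standard)."""
    momenta: List[Tuple[float, float, float, float]]  # (E, px, py, pz) per final-state particle
    w_central: float        # event weight for the central pdf
    w_pdf: List[float]      # weights for the error pdfs, paired as [+1, -1, +2, -2, ...]
    w_lo: float             # LO weight for the event
    w_scales: List[float]   # weights for alternative ren./fact. scale choices

def pdf_uncertainty(events: List[NLOEvent]) -> float:
    """Symmetric pdf uncertainty on the summed cross section, using the
    standard Hessian master formula 0.5 * sqrt(sum_i (X_i+ - X_i-)^2)."""
    n_pairs = len(events[0].w_pdf) // 2
    s = 0.0
    for i in range(n_pairs):
        plus = sum(e.w_pdf[2 * i] for e in events)
        minus = sum(e.w_pdf[2 * i + 1] for e in events)
        s += (plus - minus) ** 2
    return 0.5 * math.sqrt(s)
```

Storing the per-event weights once, rather than rerunning the full NLO code per pdf member, is the main point of the format.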

See also item 5.

**2. Higgs cross sections in and beyond the Standard Model**

This issue is too important to be just a sub-part of item 1; note that in former workshops there was a separate Higgs working group. Special attention will be given to higher order corrections to Higgs observables in BSM scenarios (coordinated with the BSM group).

**3. Identifying/analysing observables of interest**

Of special interest are observables which have an improved scale dependence, e.g. ratios of cross sections. Classical examples are W/Z and the dijet ratio. New ideas and proposals
are welcome.

Another issue is to identify jet observables which have no strong dependence on the absolute jet energy, as this will not be measured very precisely during the early running. Recent examples are jet sub-structure, boosted tops, dijet delta-phi de-correlation... This topic has some overlap with the BSM searches and inter-group activity would be welcome.

**4. Identifying important missing processes**

The Les Houches wishlist from 2005/2007 is being filled in slowly but steadily. Progress should be reported, and a discussion should identify which key processes should be added to the list; this may also include relevant NNLO corrections. This effort will result in an updated Les Houches list.

**5. Standardization of NLO computations**

The standardization of NLO computations has, of course, many aspects. Different groups would benefit from the ability to use and exchange public symbolic/numerical routines to perform computations. Given the boost such an initiative could give the field, this discussion is highly relevant. An agreement to submit results/tools/code to certain databases for storage and access (e.g. under the HEPTOOLS or HEPCode web pages) would already be progress, but is only one aspect of the issue.

For NLO computations, there is a natural split between real and virtual corrections. One could agree on finite output formats for both sectors, such that event re-weighting functions could be defined which could be used in combination with existing event ntuples.
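A minimal sketch of such a re-weighting function, assuming the ntuple stores the momentum fractions x1, x2 and the factorization scale Q alongside each weight (the pdf-ratio trick applies straightforwardly to the Born and real parts; the virtual part in general needs its own stored coefficients):

```python
def reweight_pdf(w_old, x1, x2, Q, f_old, f_new):
    """Rescale a stored event weight from pdf set f_old to f_new.
    f_old and f_new are callables returning the parton density at (x, Q)
    for the same flavour combination used when the event was generated."""
    ratio = (f_new(x1, Q) * f_new(x2, Q)) / (f_old(x1, Q) * f_old(x2, Q))
    return w_old * ratio
```

In practice f_old and f_new would be supplied by a pdf library; here they are left as generic callables.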

In many cases, it is relatively easy, a priori, to isolate the scale-dependent terms for a higher order cross section. If the scale-dependent terms are isolated in such a fashion, it may be possible to easily calculate/store the scale uncertainties as discussed in item 1.
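A schematic illustration of this decomposition, for the renormalization scale only (one-loop running; the Born coefficient B and the NLO coefficient V are placeholders, and factorization-scale logs from the pdfs are omitted):

```python
import math

b0 = (33 - 2 * 5) / (12 * math.pi)  # one-loop QCD beta coefficient, nf = 5

def alpha_s(mu, a0=0.118, mu0=91.1876):
    """One-loop running coupling, for illustration only."""
    return a0 / (1 + 2 * b0 * a0 * math.log(mu / mu0))

def sigma_nlo(mu, B, V, n, mu0):
    """NLO cross section at scale mu, reconstructed from coefficients
    stored at the reference scale mu0:
        sigma(mu) = a^n B + a^(n+1) [ V + n b0 log(mu^2/mu0^2) B ],
    with a = alpha_s(mu). The explicit log compensates the running of
    alpha_s through O(a^(n+1))."""
    a = alpha_s(mu)
    L = math.log(mu ** 2 / mu0 ** 2)
    return a ** n * B + a ** (n + 1) * (V + n * b0 * L * B)
```

Once B and V are stored per event (or per bin), the scale band from, say, mu0/2 to 2 mu0 can be evaluated without rerunning the calculation, and the residual variation is much smaller than that of the pure LO term a^n B.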

**6. IR-safe jet algorithms**

Detailed understanding of jet algorithms will play an important role in the LHC era. Much progress has been made in the last several years concerning IR-safe jet algorithms. Studies and
comparisons of different jet algorithms in the NLO context are highly welcome. Of particular interest is how the observables map from the parton level inherent in the pQCD approach to
the particle/detector level.

**7. New techniques for NLO computations and automation**

New techniques (based on unitarity and/or Feynman diagrams) may be compared concerning both their efficiency and their potential for automation. Recent developments in automated IR subtraction indicate that the NLO real emission can be treated in a very general manner, independently of the loop part, using tools such as SHERPA, Whizard, Helac, Madgraph, etc. Communication with the Monte Carlo group will be very important here.

**8. Combination of NLO with parton showers**

The combination of NLO parton level calculations with parton showers (e.g. MC@NLO, POWHEG, GRACE...) is an important issue for the development of higher order Monte Carlo tools for
the LHC. Again, this will be in collaboration with the Monte Carlo working group.