# NLO Multi-leg

### From Wiki Les Houches 09

## Revision as of 13:37, 9 June 2009

**Les Houches topics for WG1 (Standard Model and NLO Multi-leg: Session 1 only)**

## **People in Subgroups working on four main topics**

**1. Collecting results of completed higher order calculations**

The primary idea is to collect in a table the cross-section predictions for relevant LHC processes, where available. Tree-level results should be compared with higher-order predictions (at whatever order is known), and K-factors defined for specific scale/PDF choices. The table should also contain information on scale and PDF uncertainties. Inclusive results may be compared with those obtained using standard selection cuts.

Producing such a table would, of course, include a detailed comparison of results originating from different groups.

A collection of plots showing distributions of relevant observables comparing tree and higher order corrections may also be compiled.

A common format for storing parton-level event information in a ROOT ntuple should be developed. The ntuples would contain the 4-vector information for the final-state particles, the weight for the central PDF as well as the weights for the error PDFs (such as from CTEQ6.6), the LO weight for the event, and possibly the event weights for different choices of renormalization/factorization scale. A scheme for storing such ntuples for higher-order processes in accessible locations (e.g. CASTOR at CERN, ENSTORE at Fermilab) should also be developed.
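A minimal sketch of what such a parton-level event record might look like; all field names here are hypothetical illustrations, not part of any agreed format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical parton-level event record; the actual layout would be
# fixed by the proposed common format, not by this sketch.
@dataclass
class PartonEvent:
    # Four-momenta (E, px, py, pz) of the final-state particles
    momenta: List[Tuple[float, float, float, float]]
    # Weight for the central PDF member
    weight_central: float
    # Weights for the error PDF members (e.g. the CTEQ6.6 error set)
    weights_pdf_errors: List[float]
    # Leading-order weight for the same phase-space point
    weight_lo: float
    # Weights for alternative renormalization/factorization scale choices
    weights_scale: List[float] = field(default_factory=list)

def k_factor(events):
    """K-factor from the accumulated central (NLO) and LO weights."""
    nlo = sum(e.weight_central for e in events)
    lo = sum(e.weight_lo for e in events)
    return nlo / lo
```

Storing the LO weight alongside the higher-order weight is what makes the K-factor comparisons of the table directly computable from the ntuple.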

See also item 5.

**2. Higgs cross sections in and beyond the Standard Model**

This issue is too important to be just a sub-part of point 1; note that former workshops had a separate Higgs working group. Special attention will be given to higher-order corrections to Higgs observables in BSM scenarios (coordinated with the BSM group).

**3. Identifying/analysing observables of interest**

Of special interest are observables with an improved scale dependence, e.g. ratios of cross sections. Classical examples are the W/Z ratio and the dijet ratio. New ideas and proposals are welcome.
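A toy illustration (with made-up numbers) of why such ratios help: when numerator and denominator are evaluated at the same, correlated scale choices, much of the scale variation cancels in the ratio.

```python
# Illustrative only: invented cross sections (pb) at three correlated
# scale choices mu = mu0/2, mu0, 2*mu0 for two processes.
sigma_w = [11.0, 10.0, 9.3]   # hypothetical "W" cross sections
sigma_z = [5.6, 5.1, 4.75]    # hypothetical "Z" cross sections

def envelope(values):
    """Relative scale uncertainty: envelope around the central value."""
    central = values[1]
    return max(abs(v - central) for v in values) / central

# Correlated ratio: numerator and denominator taken at the SAME scale
ratios = [w / z for w, z in zip(sigma_w, sigma_z)]

print(envelope(sigma_w))   # ~10% on the individual cross section
print(envelope(ratios))    # well below 1% on the ratio
```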

Another issue is to identify jet observables that do not depend strongly on the absolute jet energy, since the jet energy scale will not be measured very precisely during early running. Recent examples are jet sub-structure, boosted tops, dijet delta-phi de-correlation, etc. This topic has some overlap with the BSM searches, and inter-group activity would be welcome.

**4. Identifying important missing processes**

The Les Houches wishlist from 2005/2007 is being completed slowly but steadily. Progress should be reported, and a discussion should identify which key processes should be added to the list, possibly including relevant NNLO corrections. This effort will result in an updated Les Houches list.

**5. Standardization of NLO computations**

A standardization of NLO computations has, of course, many aspects. Different groups would benefit from the possibility of using and exchanging public symbolic/numerical routines to perform computations. Given the boost such an initiative could give the field, this discussion is highly relevant. An agreement to submit results/tools/code to certain databases for storage and access (e.g. under the HEPTOOLS or HEPCode web pages) would already be progress, but is only one aspect of the issue.

For NLO computations, there is a natural split between real and virtual corrections. One could agree on well-defined output formats for both sectors, such that event re-weighting functions could be defined and used in combination with existing event ntuples and/or with (semi-)automated matching of NLO calculations to parton showers (with the Tools and Monte Carlo group).

In many cases, it is relatively easy, a priori, to isolate the scale-dependent terms for a higher order cross section. If the scale-dependent terms are isolated in such a fashion, it may be possible to easily calculate/store the scale uncertainties as discussed in item 1.
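A schematic sketch of the idea, assuming the scale dependence has been reduced to stored coefficients of logarithms (the actual decomposition depends on the process and perturbative order): once the coefficients are stored, the prediction can be re-evaluated at any scale without redoing the calculation.

```python
import math

# Schematic: sigma(mu) written as a polynomial in L = log(mu/mu0),
# with the coefficients c_k stored once per phase-space point or bin.
def sigma_at_scale(coeffs, mu, mu0):
    L = math.log(mu / mu0)
    return sum(c * L**k for k, c in enumerate(coeffs))

coeffs = [100.0, -8.0]   # hypothetical: sigma(mu0) = 100, one log term
central = sigma_at_scale(coeffs, 100.0, 100.0)   # mu = mu0
up      = sigma_at_scale(coeffs, 200.0, 100.0)   # mu = 2*mu0
down    = sigma_at_scale(coeffs,  50.0, 100.0)   # mu = mu0/2
```

Evaluating `up` and `down` in this way is what would allow scale uncertainties to be computed and stored cheaply, as mentioned in item 1.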

**6. IR-safe jet algorithms**

Detailed understanding of jet algorithms will play an important role in the LHC era. Much progress has been made in the last several years concerning IR-safe jet algorithms. Studies and
comparisons of different jet algorithms in the NLO context are highly welcome. Of particular interest is how the observables map from the parton level inherent in the pQCD approach to
the particle/detector level.

**7. New techniques for NLO computations and automation**

New techniques (based on unitarity and/or Feynman diagrams) may be compared concerning both efficiency and their potential for automation. Recent developments in automated IR subtraction indicate that the NLO real-emission part can be treated in a very general manner, independently of the loop part, using tools such as SHERPA, Whizard, Helac, Madgraph, etc. Communication with the Monte Carlo group will be very important here.

**8. Combination of NLO with parton showers**

The combination of NLO parton level calculations with parton showers (e.g. MC@NLO, POWHEG, GRACE...) is an important issue for the development of higher order Monte Carlo tools for
the LHC. Again, this will be in collaboration with the Monte Carlo working group.

**Comments on a Les Houches accord on standardisation of NLO computations**

The demand for precise predictions for LHC phenomenology has led to remarkable progress in the last two years. Monte Carlo programs based on efficient matrix-element event generators are now available that can deal with multi-parton final states of order 10 particles. Sophisticated methods such as multi-channeling, colour and helicity sampling, etc. allow for efficient and numerically stable phase-space integration.
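A toy illustration of the multi-channel idea (not a real matrix-element integrator): each channel density is adapted to one structure of the integrand, and the channels are combined into a single importance-sampling density.

```python
import math
import random

# Toy multi-channel Monte Carlo for I = ∫_0^1 f(x) dx, where f has an
# integrable peak at x -> 0, mimicking a propagator singularity.
def f(x):
    return 1.0 / math.sqrt(x) + 1.0

# Channel 1: density g1(x) = 1/(2*sqrt(x)), sampled via x = u^2
# (absorbs the peak). Channel 2: flat density g2(x) = 1.
def sample(alpha1, rng):
    if rng.random() < alpha1:
        x = rng.random() ** 2
    else:
        x = rng.random()
    x = max(x, 1e-12)  # guard against the (measure-zero) point x = 0
    g = alpha1 * 0.5 / math.sqrt(x) + (1.0 - alpha1)
    return f(x) / g    # importance-sampling weight

rng = random.Random(1)
n = 20000
estimate = sum(sample(0.5, rng) for _ in range(n)) / n
# Exact result: ∫_0^1 (1/sqrt(x) + 1) dx = 3
```

Because channel 1 matches the peak, the weight f/g stays bounded and the variance is small; a flat sampler alone would give unbounded weights near x = 0.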

The combination with parton showers needs merging procedures which relate soft/collinear and hard phase-space regions with each other, an issue which will be discussed extensively in the Monte Carlo group.

It is very interesting to note that if one wants to promote general-purpose Monte Carlo tools to next-to-leading-order (NLO) precision, essentially two features have to be added to the existing tree-level technology.

1) Soft/collinear subtraction terms to reshuffle infrared (IR) divergences between tree- and loop-level NLO contributions.

2) A reliable method to evaluate one-loop amplitudes.

Ad 1) Several groups now incorporate dipole subtraction terms à la Catani-Seymour in their ME generators, which guarantees highly efficient phase-space integration within an automated framework.

Ad 2) Both the unitarity-based and the Feynman-diagrammatic approach are capable of evaluating one-loop 2 -> 4 matrix elements. The algorithms used are thus validated, and the production of many results along the same lines is feasible.
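In the subtraction picture referred to under 1), the standard NLO structure reads

```latex
\sigma^{\mathrm{NLO}} =
  \int_{n+1} \left[ \mathrm{d}\sigma^{R} - \mathrm{d}\sigma^{A} \right]
+ \int_{n}   \left[ \mathrm{d}\sigma^{V} + \int_{1} \mathrm{d}\sigma^{A} \right],
```

where $\mathrm{d}\sigma^{R}$, $\mathrm{d}\sigma^{V}$ and $\mathrm{d}\sigma^{A}$ are the real, virtual and subtraction contributions. Both square brackets are separately finite, so the $(n+1)$- and $n$-parton integrations can be performed numerically in four dimensions, which is what makes the reshuffling of IR divergences between tree- and loop-level contributions practical.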

It seems to be exactly the right time to start the discussion on how to combine the tree- and loop-level developments into a more unified and standardised framework before the different NLO groups start their result production phase. It is evident that synergies between the two fields would speed up phenomenological progress substantially. One should just think about how often the wheel has been reinvented in the context of amplitude evaluation and phase space integration!

To promote NLO matrix elements into transportable code which can easily be incorporated into a ME set-up, only a very limited number of conventions has to be fixed. In detail, the I/O handling of

- 4-vectors
- (Standard) Model parameters
- colour information
- helicity information
- UV treatment
- IR treatment

has to be agreed on.

Note that concerning quantum numbers, a loop amplitude corresponds to its LO counterpart. An agreement on certain "industrial standards" must, of course, not hinder or complicate scientific progress in any way. A switch for different information levels would allow for flexibility: if some groups want to provide only colour/helicity-summed, IR/UV-subtracted amplitudes, this should of course be possible.
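A schematic sketch of what such an I/O convention between a ME generator and a one-loop provider might look like; every name here is hypothetical and serves only to make the list of conventions above concrete:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# Hypothetical interface; none of these names are part of any agreed
# standard. The fields mirror the items that have to be fixed:
# 4-vectors, model parameters, and the UV/IR treatment.
@dataclass
class LoopAmplitudeRequest:
    momenta: List[Tuple[float, float, float, float]]  # 4-vectors (E, px, py, pz)
    parameters: Dict[str, float]  # (Standard) Model parameters, e.g. alpha_s, masses
    mu_r: float                   # renormalization scale (UV treatment)
    # Information level, as discussed above:
    # "summed" = colour/helicity-summed, IR/UV-subtracted amplitude
    mode: str = "summed"

@dataclass
class LoopAmplitudeResult:
    # IR treatment: Laurent coefficients in dimensional regularization,
    # i.e. the finite part and the 1/eps and 1/eps^2 pole coefficients
    finite: float
    single_pole: float
    double_pole: float
```

The `mode` switch is one way to realise the different information levels mentioned above; colour- or helicity-resolved modes would add corresponding fields.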

We would like to encourage everybody in the NLM and MC working groups to discuss these issues at Les Houches, with the aim of reaching an agreement on how to pass results efficiently in the form of transportable computer code. This would be an important step towards a phenomenological description of LHC data at next-to-leading order.

The NLM conveners