High Energy Theory
The research activities supported by the Theoretical Physics subprogram include, but are not limited to: precision calculations of Standard Model processes; developing models that address shortcomings of the Standard Model; designing ways to test these models and interpreting the results of measurements in the context of these models; identifying where new physical principles are needed and what their consequences may be; developing and exploiting new mathematical and computational methods for analyzing theoretical models; and constructing and exploiting powerful computational facilities for theoretical calculations of importance to the experimental program. Major themes are symmetry and unification in the description of diverse phenomena. The Theoretical Physics subprogram also provides funding for maintaining and upgrading important databases for High Energy Physics research, such as the Particle Data Group (PDG) at LBNL and SPIRES at SLAC. It also supports the education and outreach program QuarkNet.
Relationship to Other Programs:
The Theoretical Physics subprogram functions synergistically with many of the other subprograms in the Research and Technology Division of OHEP. The Computational HEP program provides hardware support and software development that help improve the accuracy and precision of theoretical predictions. Improved calculations of parton distribution functions and scattering amplitudes are important for experiments at proton accelerators. Precision lattice QCD calculations play a crucial role in intensity frontier experiments at both proton and electron accelerators. Theoretical studies of neutrinos and of the nature of dark matter and dark energy go hand in hand with non-accelerator-based experiments. Our understanding of the universe relies on the active, integrated participation of theorists in interpreting the results of particle physics experiments.
The Standard Model is our current framework for understanding the strong, electromagnetic and weak interactions. However, the model is not fully understood, leaves many open questions, and cannot explain a number of experimental observations.
The least understood part of the Standard Model is the strong interaction (QCD). The strength of the strong coupling, particularly at low energies, makes it difficult to obtain accurate predictions for comparison with experiment. The lattice formulation provides a framework for numerical calculations but is limited by available computing capability, and it is also necessary to reconcile the different fermion formulations. Significant advances have been achieved recently through improved computing technology and more efficient algorithms, but further improvement will be required to confront future Intensity Frontier experiments. The use of Graphics Processing Unit (GPU) technology may help by providing an economical way to boost computing power.
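The growth of the strong coupling at low energies can be illustrated with the textbook one-loop running formula (a standard result, not code from this subprogram); the Python sketch below uses an assumed fixed five-flavor scheme and a reference value of alpha_s at the Z mass, and shows the coupling becoming large as the energy scale Q approaches 1 GeV, which is where perturbation theory fails and lattice methods become essential.

```python
import math

def alpha_s_one_loop(Q, alpha_ref=0.118, mu_ref=91.19, n_f=5):
    """One-loop running of the strong coupling alpha_s(Q).

    alpha_ref is alpha_s at the reference scale mu_ref (in GeV), taken
    here to be the Z boson mass; n_f is the number of active quark
    flavors, held fixed at 5 for simplicity (a rough approximation
    at low Q, where fewer flavors are active).
    """
    b0 = (33.0 - 2.0 * n_f) / (12.0 * math.pi)
    return alpha_ref / (1.0 + b0 * alpha_ref * math.log(Q**2 / mu_ref**2))

# The coupling grows as Q decreases, signaling the breakdown of
# perturbation theory at low energies:
for Q in (91.19, 10.0, 2.0, 1.0):
    print(f"alpha_s({Q:6.2f} GeV) = {alpha_s_one_loop(Q):.3f}")
```

At the Z mass this reproduces the input value by construction, while toward 1 GeV the one-loop coupling roughly triples, which is why low-energy hadronic quantities cannot be computed as a perturbative series.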
At the high-energy end, higher-order perturbative calculations are required for analyzing data from collider experiments. These calculations require summing a large number of contributions. Significant advances have been made recently, and the trend appears to be toward automating the calculations. This again requires significant computing capability.
Open questions regarding QCD include the strong CP problem and the nature of the X, Y, and Z resonances observed by BaBar and Belle (some of which were also seen at the Tevatron).
Other important questions regarding the Standard Model include the underlying mechanism of electroweak symmetry breaking, the origin of fermion masses (including neutrino masses), and how the matter-antimatter asymmetry came about. Many possible answers have been suggested, but experimental data are needed to find out what nature's answer is.
Another challenge is to incorporate gravity into the Standard Model to obtain a unified description of all the known fundamental forces. String theory appears to be a promising approach but has not yet led to a definitive answer. The study of string theory has also raised the possibility of understanding a strongly coupled theory through its holographic dual. Will this lead to a better way to understand QCD?
The discoveries of dark matter and dark energy revealed that the Standard Model may be relevant for merely 5% of the universe. The challenge now is to better understand the properties of dark matter and dark energy.
The types of problems currently tackled by the High Energy Theory subprogram are likely to remain relevant for many years. Depending on what is discovered at the Tevatron and the LHC, the direction and focus may shift. In the near term, organic growth is expected in the areas of Phenomenology and Model Building and of Cosmology and Particle Astrophysics. Advances in lattice QCD will depend on increases in computational power and on new algorithms that may simplify the calculations. The recent trend of utilizing GPUs may help boost computing power at relatively low cost.