Computational particle physics refers to the methods and computing tools developed in and used by particle physics research. Like computational chemistry or computational biology, it is, for particle physics, both a specific branch and an interdisciplinary field relying on computer science, theoretical and experimental particle physics and mathematics.
The main fields of computational particle physics are: lattice field theory (numerical computations), automatic calculation of particle interaction or decay (computer algebra) and event generators (stochastic methods).
[https://arxiv.org/abs/1301.1211 ''Computational Particle Physics for Event Generators and Data Analysis'', retrieved 24 August 2020] [https://www2.ccs.tsukuba.ac.jp/projects/ILFTNet/ ''International research network for computational particle physics'', retrieved 24 August 2020]
Computing tools
* Computer algebra: Many of the computer algebra languages were developed initially to help particle physics calculations: Reduce, Mathematica, Schoonschip, Form, GiNaC.
* Data Grid: The largest planned use of the grid systems will be for the analysis of the LHC-produced data. Large software packages have been developed to support this application, like the LHC Computing Grid (LCG). A similar effort in the wider e-Science community is the GridPP collaboration, a consortium of particle physicists from UK institutions and CERN.
* Data Analysis Tools: These tools are motivated by the fact that particle physics experiments and simulations often create large datasets (see the references for examples).
* Software Libraries: Many software libraries are used for particle physics computations. Also important are packages that simulate particle physics interactions using Monte Carlo simulation techniques, i.e. event generators; a minimal sketch of the underlying stochastic sampling appears after this list.
*
CompHEP
*
UrQMD
*
APFEL
*
Geant4
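The packages above differ widely in scope, but event generators share a common core: repeated random sampling of final states from a differential distribution. The following toy sketch is written purely for illustration and is not taken from any of the packages listed; the angular distribution, sample size and random seed are arbitrary assumptions. It shows the accept/reject sampling step that such generators build on.

```cpp
// Toy illustration of accept/reject Monte Carlo sampling, the stochastic
// technique underlying event generators. "Events" here are values of
// cos(theta) drawn from the toy angular distribution dN/dcos ∝ 1 + cos^2(theta).
#include <cmath>
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(12345);                        // reproducible pseudo-random numbers
    std::uniform_real_distribution<double> u(0.0, 1.0);

    const double fmax = 2.0;                        // maximum of 1 + cos^2(theta)
    const int n_events = 100000;
    double sum = 0.0, sum2 = 0.0;

    for (int accepted = 0; accepted < n_events; ) {
        double c = -1.0 + 2.0 * u(rng);             // trial cos(theta), uniform in [-1, 1]
        double f = 1.0 + c * c;                     // target (unnormalised) density
        if (u(rng) * fmax <= f) {                   // accept with probability f / fmax
            sum  += c;
            sum2 += c * c;
            ++accepted;
        }
    }

    // Analytically <cos^2(theta)> = 2/5 for this distribution; the sample
    // average should agree within its statistical error.
    std::cout << "<cos theta>   = " << sum  / n_events << "\n";
    std::cout << "<cos^2 theta> = " << sum2 / n_events << "\n";
    return 0;
}
```

Production generators wrap this elementary sampling step in matrix-element evaluation, phase-space parametrisation and hadronisation models.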
History
Particle physics played a role in the early history of the internet; the World Wide Web was created by Tim Berners-Lee while working at CERN in 1991.
Computer algebra
Note: This section contains an excerpt from 'Computer Algebra in Particle Physics' by Stefan Weinzierl
Particle physics is an important field of application for computer algebra and exploits the capabilities of Computer Algebra Systems (CAS). This leads to valuable feedback for the development of CAS. Looking at the history of computer algebra systems, the first programs date back to the 1960s.[Stefan Weinzierl, op. cit.: pp. 3–5] The first systems were almost entirely based on LISP ("LISt Processing"). LISP is an interpreted language and, as the name already indicates, designed for the manipulation of lists. Its importance for symbolic computer programs in the early days has been compared to the importance of FORTRAN for numerical programs in the same period.
Already in this first period, the program REDUCE had some special features for applications in high energy physics. An exception to the LISP-based programs was SCHOONSHIP, written in assembler language by Martinus J. G. Veltman and specially designed for applications in particle physics. The use of assembler code led to an incredibly fast program (compared to the interpreted programs of that time) and allowed the calculation of more complex scattering processes in high energy physics. It has been claimed that the program's importance was recognized in 1999, when half of the Nobel Prize in Physics was awarded to Veltman.
The program MACSYMA also deserves explicit mention, since it triggered important developments with regard to algorithms. In the 1980s new computer algebra systems started to be written in C. This enabled better exploitation of the resources of the computer (compared to the interpreted language LISP) and at the same time allowed portability to be maintained (which would not have been possible in assembler language). This period also marked the appearance of the first commercial computer algebra systems, among which Mathematica and Maple are the best-known examples. In addition, a few dedicated programs appeared; an example relevant to particle physics is the program FORM by J. Vermaseren, a (portable) successor to SCHOONSHIP. More recently, the maintainability of large projects became more and more important, and the overall programming paradigm changed from procedural programming to object-oriented design. In terms of programming languages this was reflected by a move from C to C++. Following this change of paradigm, the library GiNaC was developed, which allows symbolic calculations within C++.
Code generation from computer algebra systems can also be used in this area.
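As a concrete illustration of the preceding paragraph, the following minimal sketch shows the style of symbolic calculation that GiNaC enables directly in C++. It is not taken from any of the cited works; it assumes only a standard GiNaC installation, and the symbols and expression are purely illustrative rather than a physics calculation.

```cpp
// Minimal GiNaC sketch: build a symbolic expression, expand it,
// differentiate it, and substitute numeric values.
#include <iostream>
#include <ginac/ginac.h>

int main() {
    using namespace GiNaC;

    symbol x("x"), y("y");

    // A symbolic expression (illustrative only).
    ex e = pow(x + y, 3);

    std::cout << "expanded:    " << e.expand() << std::endl;   // x^3 + 3*x^2*y + ...
    std::cout << "d/dx:        " << e.diff(x) << std::endl;    // 3*(x+y)^2
    std::cout << "at x=2, y=1: " << e.subs(lst{x == 2, y == 1}) << std::endl;  // 27

    return 0;
}
```

With GiNaC installed, such a file can typically be compiled with g++ and linked against -lginac -lcln.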
Lattice field theory
Lattice field theory was created by Kenneth Wilson in 1974. Simulation techniques were later developed from statistical mechanics.
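To make the phrase "simulation techniques from statistical mechanics" concrete, the sketch below applies a Metropolis accept/reject update to a scalar field on a one-dimensional periodic lattice with a simple discretised action. It is a toy written for this section, not an excerpt from any lattice QCD code; the lattice size, mass parameter and proposal width are arbitrary illustrative choices.

```cpp
// Toy Metropolis simulation of a free scalar field on a 1-D periodic lattice.
// Discretised action per site: S = (phi(x+1) - phi(x))^2 / 2 + m^2 phi(x)^2 / 2.
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const int    L      = 64;      // number of lattice sites
    const double m2     = 0.5;     // mass term m^2 (illustrative value)
    const double step   = 1.0;     // proposal width
    const int    sweeps = 20000;

    std::mt19937 rng(2024);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    std::vector<double> phi(L, 0.0);

    // Local change in the action when phi[x] -> phi[x] + d.
    auto deltaS = [&](int x, double d) {
        double left  = phi[(x - 1 + L) % L];
        double right = phi[(x + 1) % L];
        auto S_site = [&](double p) {
            return 0.5 * ((right - p) * (right - p) + (p - left) * (p - left))
                 + 0.5 * m2 * p * p;
        };
        return S_site(phi[x] + d) - S_site(phi[x]);
    };

    double phi2_sum = 0.0;
    int    n_meas   = 0;

    for (int sweep = 0; sweep < sweeps; ++sweep) {
        for (int x = 0; x < L; ++x) {
            double d  = step * (2.0 * u(rng) - 1.0);    // symmetric proposal
            double dS = deltaS(x, d);
            if (dS <= 0.0 || u(rng) < std::exp(-dS))    // Metropolis accept/reject
                phi[x] += d;
        }
        if (sweep > sweeps / 2) {                       // measure after thermalisation
            for (double p : phi) phi2_sum += p * p;
            n_meas += L;
        }
    }

    std::cout << "<phi^2> ~ " << phi2_sum / n_meas << std::endl;
    return 0;
}
```

Production lattice QCD codes apply more elaborate update algorithms (such as hybrid Monte Carlo) to SU(3) gauge fields on four-dimensional lattices, which is what drives the massively parallel computing requirements described below.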
Since the early 1980s, lattice QCD (LQCD) researchers have pioneered the use of massively parallel computers in large scientific applications, using virtually all available computing systems including traditional mainframes, large PC clusters, and high-performance systems. In addition, lattice QCD has been used as a benchmark for high-performance computing, starting with the IBM Blue Gene supercomputer.
Eventually, national and regional QCD grids were created: LATFOR (continental Europe), UKQCD and USQCD. The ILDG (International Lattice Data Grid) is an international venture comprising grids from the UK, the US, Australia, Japan and Germany, and was formed in 2002.
See also
*
Les Houches Accords
*
Computational physics
References
External links
* Brown University Computational High Energy Physics (CHEP) group page
* International Research Network for Computational Particle Physics, Center for Computational Sciences, University of Tsukuba, Japan
* History of computing at CERN