Engineering Statistics

Engineering statistics combines engineering and statistics using scientific methods for analyzing data. Engineering statistics involves data concerning manufacturing processes such as component dimensions, tolerances, type of material, and fabrication process control. There are many methods used in engineering analysis, and they are often displayed as histograms to give a visual representation of the data rather than leaving it purely numerical. Examples of methods are:

1. Design of experiments (DOE) is a methodology for formulating scientific and engineering problems using statistical models. The protocol specifies a randomization procedure for the experiment and specifies the primary data analysis, particularly in hypothesis testing. In a secondary analysis, the statistical analyst further examines the data to suggest other questions and to help plan future experiments. In engineering applications, the goal is often to optimize a process or product rather than to subject a scientific hypothesis to a test of its predictive adequacy (Box, Hunter, and Hunter, "Statistics for Experimenters: Design, Innovation, and Discovery", 2nd Edition, Wiley, 2005). The use of optimal (or near-optimal) designs reduces the cost of experimentation.

2. Quality control and process control use statistics as a tool to manage conformance to specifications of manufacturing processes and their products.

3. Time and methods engineering uses statistics to study repetitive operations in manufacturing in order to set standards and find optimum (in some sense) manufacturing procedures.

4. Reliability engineering measures the ability of a system to perform its intended function (and for its intended time) and provides tools for improving performance.

5. Probabilistic design involves the use of probability in product and system design.

6. System identification uses statistical methods to build mathematical models of dynamical systems from measured data; a minimal sketch follows this list. System identification also includes the optimal design of experiments for efficiently generating informative data for fitting such models.
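
As a minimal sketch of the system identification idea, the following Python snippet simulates a first-order discrete-time process and then recovers its parameters from measured input/output data by ordinary least squares. The model structure, parameter values, and noise level are illustrative assumptions, not specifics taken from this article.

    import numpy as np

    # Simulate a first-order system y[k] = a*y[k-1] + b*u[k-1] + noise,
    # then estimate (a, b) from the recorded input/output data.
    # The "true" parameter values below are hypothetical.
    rng = np.random.default_rng(1)
    a_true, b_true = 0.8, 0.5
    n = 200
    u = rng.normal(size=n)            # measured input signal
    y = np.zeros(n)                   # measured output signal
    for k in range(1, n):
        y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.05 * rng.normal()

    # Each row of the regressor matrix is [y[k-1], u[k-1]]; the target is y[k].
    Phi = np.column_stack([y[:-1], u[:-1]])
    theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    print(f"estimated a = {theta[0]:.3f}, b = {theta[1]:.3f}")

The estimates should land close to the assumed values (0.8 and 0.5), with the remaining error reflecting the injected measurement noise.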


History

Engineering statistics dates back to about 1000 B.C., when the abacus was developed as a means to calculate numerical data. In the 1600s, the development of information processing to systematically analyze and process data began. In 1654, the slide rule technique was developed by Robert Bissaker for advanced data calculations. In 1833, a British mathematician named Charles Babbage designed the idea of an automatic computer, which inspired developers at Harvard University and IBM to design the first mechanical automatic-sequence-controlled calculator, the Mark I. The integration of computers and calculators into industry brought about a more efficient means of analyzing data and the beginning of engineering statistics.


Examples


Factorial Experimental Design

A factorial experiment is one where, contrary to the standard experimental philosophy of changing only one independent variable and holding everything else constant, multiple independent variables are tested at the same time. With this design, statistical engineers can see both the direct effects of one independent variable (main effects) and the potential interaction effects that arise when multiple independent variables produce a different result together than either would on its own.
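
A minimal sketch of how such effects can be estimated from a two-level, two-factor experiment is shown below; the factors, coded levels, and response values are hypothetical and chosen only to illustrate the arithmetic.

    import numpy as np

    # Hypothetical 2x2 factorial experiment: factors A and B each at a low (-1)
    # and high (+1) coded level, with one measured response per combination.
    #                 A    B   response
    runs = np.array([[-1, -1, 52.0],
                     [+1, -1, 60.0],
                     [-1, +1, 54.0],
                     [+1, +1, 68.0]])
    A, B, y = runs[:, 0], runs[:, 1], runs[:, 2]

    # A main effect is the average response at the high level minus the average
    # at the low level; the interaction uses the product of the coded levels.
    main_A = y[A == 1].mean() - y[A == -1].mean()
    main_B = y[B == 1].mean() - y[B == -1].mean()
    interaction_AB = y[A * B == 1].mean() - y[A * B == -1].mean()

    print(f"main effect of A:  {main_A:.1f}")
    print(f"main effect of B:  {main_B:.1f}")
    print(f"A x B interaction: {interaction_AB:.1f}")

A nonzero interaction term indicates that the effect of A depends on the level of B, which is exactly the information a one-factor-at-a-time experiment cannot reveal.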


Six Sigma

Six Sigma is a set of techniques for improving the reliability of a manufacturing process. Ideally, every product would match its desired specifications exactly, but the countless imperfections of real-world manufacturing make this impossible. The as-built specifications of a product are assumed to be centered around a mean, with each individual product deviating some amount from that mean in a normal distribution. The goal of Six Sigma is to ensure that the acceptable specification limits are six standard deviations away from the mean of the distribution; in other words, that each step of the manufacturing process has at most a 0.00034% chance (about 3.4 defects per million opportunities) of producing a defect.
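
The 3.4-defects-per-million figure follows from placing the specification limits six standard deviations from the nominal mean while allowing the conventional 1.5-sigma long-term shift of the process mean. A short sketch of that calculation, under the normality assumption described above:

    from math import erf, sqrt

    def norm_cdf(x: float) -> float:
        """Standard normal cumulative distribution function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def defect_rate(sigma_level: float, shift: float = 1.5) -> float:
        """Fraction of output falling outside spec limits placed sigma_level
        standard deviations from the nominal mean, with the mean allowed to
        drift by `shift` standard deviations over the long term."""
        upper_tail = 1.0 - norm_cdf(sigma_level - shift)
        lower_tail = norm_cdf(-sigma_level - shift)
        return upper_tail + lower_tail

    # At the six sigma level this evaluates to roughly 3.4 defects per million
    # opportunities, i.e. about 0.00034%.
    print(f"{defect_rate(6.0) * 1e6:.1f} defects per million")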




References

* Box, G. E., Hunter, W. G., and Hunter, J. S., "Statistics for Experimenters: Design, Innovation, and Discovery", 2nd Edition, Wiley, 2005.

