Taguchi methods (Japanese: タグチメソッド) are statistical methods, sometimes called robust design methods, developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently also applied to engineering, biotechnology, marketing and advertising. Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly by Taguchi's development of designs for studying variation, but have criticized the inefficiency of some of Taguchi's proposals.

Taguchi's work includes three principal contributions to statistics:
*A specific loss function
*The philosophy of ''off-line quality control''; and
*Innovations in the design of experiments.


Loss functions


Loss functions in statistical theory

Traditionally, statistical methods have relied on mean-unbiased estimators of treatment effects: under the conditions of the Gauss–Markov theorem, least squares estimators have minimum variance among all mean-unbiased linear estimators. The emphasis on comparisons of means also draws (limiting) comfort from the law of large numbers, according to which the sample means converge to the true mean. Fisher's textbook on the design of experiments emphasized comparisons of treatment means; however, loss functions were avoided by Ronald A. Fisher.


Taguchi's use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss functions. Reacting to Fisher's methods in the design of experiments, Taguchi interpreted Fisher's methods as being adapted for seeking to improve the mean outcome of a process. Indeed, Fisher's work had been largely motivated by programmes to compare agricultural yields under different treatments and blocks, and such experiments were done as part of a long-term programme to improve harvests. However, Taguchi realised that in much industrial production, there is a need to produce an outcome ''on target'', for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality, and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider ''cost to society''. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers, which are more interested in their private costs than in social costs. Such externalities prevent markets from operating efficiently, according to analyses of public economics. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits. Such losses are, of course, very small when an item is near to nominal.
Donald J. Wheeler characterised the region within specification limits as one where we ''deny that losses exist''. As we diverge from nominal, losses grow until the point where ''losses are too great to deny'' and the specification limit is drawn. All these losses are, as W. Edwards Deming would describe them, ''unknown and unknowable'', but Taguchi wanted to find a useful way of representing them statistically. Taguchi specified three situations:
#Larger the better (for example, agricultural yield);
#Smaller the better (for example, carbon dioxide emissions); and
#On-target, minimum-variation (for example, a mating part in an assembly).
The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function for several reasons:
*It is the first "symmetric" term in the Taylor series expansion of real analytic loss functions.
*Total loss is measured by the variance. For uncorrelated random variables, variance is additive, so the total loss is an additive measurement of cost.
*The squared-error loss function is widely used in statistics, following Gauss's use of the squared-error loss function in justifying the method of least squares.
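The on-target, minimum-variation case can be illustrated with a short sketch of the quadratic loss L(y) = k(y − m)², where m is the nominal value and k converts squared deviation into cost. The cost constant, target and measurements below are invented for illustration, not drawn from Taguchi's writings:

```python
# Quadratic (squared-error) loss for the on-target case:
# L(y) = k * (y - m)**2, with nominal value m and cost constant k.
# Illustrative numbers only.

def quadratic_loss(y, target, k):
    """Loss attributed to a single item with measured value y."""
    return k * (y - target) ** 2

def average_loss(values, target, k):
    """Expected loss over a batch: k * (variance + bias**2)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    bias = mean - target
    return k * (var + bias ** 2)

diameters = [10.02, 9.97, 10.05, 9.99, 10.01]  # machined holes, target 10 mm
print(average_loss(diameters, target=10.0, k=50.0))
```

The decomposition into variance plus squared bias makes the additivity point above concrete: items exactly at nominal contribute nothing, while both spread and systematic offset add to total cost.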


Reception of Taguchi's ideas by statisticians

Though many of Taguchi's concerns and conclusions are welcomed by statisticians and economists, some of his ideas have met particular criticism. For example, statisticians have criticized Taguchi's recommendation that industrial experiments maximise some ''signal-to-noise ratio'' (representing the magnitude of the mean of a process compared to its variation).
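The signal-to-noise ratios most commonly attributed to Taguchi, one for each of the three loss situations above, can be written in a few lines. The measurement lists below are illustrative only:

```python
import math

# Taguchi-style signal-to-noise (S/N) ratios, in decibels.
# In all three cases a higher S/N is better; the experimenter
# picks the factor settings that maximise it.

def sn_smaller_is_better(y):
    return -10 * math.log10(sum(v * v for v in y) / len(y))

def sn_larger_is_better(y):
    return -10 * math.log10(sum(1 / (v * v) for v in y) / len(y))

def sn_nominal_is_best(y):
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / (n - 1)  # sample variance
    return 10 * math.log10(mean * mean / var)

print(sn_nominal_is_best([1.48, 1.51, 1.50, 1.49, 1.52]))
```

The nominal-is-best form makes the criticized point explicit: it rewards a large mean relative to variation, which is not always what a given engineering problem calls for.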


Off-line quality control


Taguchi's rule for manufacturing

Taguchi realized that the best opportunity to eliminate variation of the final product quality is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:
*System design
*Parameter (measure) design
*Tolerance design


System design

This is design at the conceptual level, involving creativity and innovation.


Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi's radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimize the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called robustification. Robust parameter designs consider both controllable variables and uncontrollable noise variables; they seek to exploit the relationships between them and to optimize settings that minimize the effects of the noise variables.
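A toy sketch can show the lever that parameter design exploits: when the output depends nonlinearly on a control variable, some settings make the output insensitive to noise. The process model below is entirely invented for illustration; in practice the relationship comes from physical experiments, not a known formula:

```python
import random

# Hypothetical process: output depends on a control setting x and a
# noise variable z. Because the coefficient of z varies with x, some
# choices of x damp the noise -- the core idea of parameter design.

def process(x, z):
    return 10 + (x - 3) ** 2 * z  # invented model, illustration only

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(200)]  # simulated noise conditions

def output_variance(x):
    ys = [process(x, z) for z in noise]
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys) / len(ys)

# Scan candidate control settings and keep the most robust one.
best_x = min((i * 0.1 for i in range(0, 61)), key=output_variance)
print(best_x)  # settings near x = 3 make the output insensitive to z
```

Here the nominal output (10) is unchanged across settings, so the choice of x is free to target variance alone, mirroring the under-specification point made above.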


Tolerance design

With a successfully completed ''parameter design'', and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions.


Design of experiments

Taguchi developed his experimental theories independently; he first read works following R. A. Fisher in 1954.


Outer arrays

Taguchi's designs aimed to allow greater understanding of variation than did many of the traditional designs from the analysis of variance (following Fisher). Taguchi contended that conventional sampling is inadequate here, as there is no way of obtaining a random sample of future conditions. In Fisher's design of experiments and analysis of variance, experiments aim to reduce the influence of nuisance factors to allow comparisons of the mean treatment-effects. Variation becomes even more central in Taguchi's thinking.

Taguchi proposed extending each experiment with an "outer array" (possibly an orthogonal array); the "outer array" should simulate the random environment in which the product would function. This is an example of judgmental sampling. Many quality specialists have been using "outer arrays". Later innovations in outer arrays resulted in "compounded noise": combining a few noise factors to create two levels in the outer array, one of noise factors that drive output lower, and one of noise factors that drive output higher. "Compounded noise" simulates the extremes of noise variation but uses fewer experimental runs than would previous Taguchi designs.
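The run-count arithmetic behind crossed arrays and compounded noise can be sketched directly. The factor counts and levels below are invented for illustration, and a real outer array might be a fractional orthogonal array rather than the full factorial used here:

```python
from itertools import product

# Crossed-array layout: every inner-array (control) run is repeated
# under every outer-array (noise) condition. "Compounded noise"
# collapses the outer array to two extreme conditions.

inner_array = list(product([0, 1], repeat=3))   # 3 control factors, 2 levels: 8 runs
full_outer  = list(product([-1, 1], repeat=3))  # 3 noise factors, 2 levels: 8 conditions
compounded  = [(-1, -1, -1), (1, 1, 1)]         # noise pushed all-low and all-high

print(len(inner_array) * len(full_outer))  # 64 runs with a full outer array
print(len(inner_array) * len(compounded))  # 16 runs with compounded noise
```

The quarter-sized experiment is the attraction of compounded noise; the cost is that only the presumed extremes of the noise space are ever observed.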


Management of interactions


Interactions, as treated by Taguchi

Many of the orthogonal arrays that Taguchi has advocated are saturated arrays, allowing no scope for estimation of interactions. This is a continuing topic of controversy. However, this is only true for "control factors", or factors in the "inner array". By combining an inner array of control factors with an outer array of "noise factors", Taguchi's approach provides "full information" on control-by-noise interactions, it is claimed. Taguchi argues that such interactions have the greatest importance in achieving a design that is robust to noise factor variation. The Taguchi approach provides more complete interaction information than typical fractional factorial designs, its adherents claim.
*Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics. That notwithstanding, a "confirmation experiment" offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the "likelihood" of control-factor-by-control-factor interactions is greatly reduced, since "energy" is "additive".


Inefficiencies of Taguchi's designs

* Interactions are part of the real world. In Taguchi's arrays, interactions are confounded and difficult to resolve. Statisticians working in response surface methodology (RSM) advocate the "sequential assembly" of designs: in the RSM approach, a screening design is followed by a "follow-up design" that resolves only the confounded interactions judged worth resolving. A second follow-up design may be added (time and resources allowing) to explore possible high-order univariate effects of the remaining variables, since high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi's designs.
* Statisticians have developed designs that enable experiments to use fewer replications (or experimental runs), enabling savings over Taguchi's proposed designs. Box-Draper, Atkinson-Donev-Tobias, Goos, and Wu-Hamada discuss the sequential assembly of designs (see Bibliography).


Assessment

Genichi Taguchi has made valuable contributions to statistics and engineering. His emphasis on ''loss to society'', techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been influential in improving manufactured quality worldwide.






Bibliography

* Box, G. E. P. and Draper, Norman (2007) ''Response Surfaces, Mixtures, and Ridge Analyses'', Second Edition [of ''Empirical Model-Building and Response Surfaces'', 1987], Wiley.
* R. H. Hardin and N. J. A. Sloane, "A New Approach to the Construction of Optimal Designs", ''Journal of Statistical Planning and Inference'', vol. 37, 1993, pp. 339–369.
* R. H. Hardin and N. J. A. Sloane, "Computer-Generated Minimal (and Larger) Response Surface Designs: (I) The Sphere".
* R. H. Hardin and N. J. A. Sloane, "Computer-Generated Minimal (and Larger) Response Surface Designs: (II) The Cube".
* Moen, R. D.; Nolan, T. W. & Provost, L. P. (1991) ''Improving Quality Through Planned Experimentation''.
* Bagchi, Tapan P. and Madhuranjan Kumar (1992) "Multiple Criteria Robust Design of Electronic Devices", ''Journal of Electronic Manufacturing'', vol. 3(1), pp. 31–38.
* Montgomery, D. C. (2005) Ch. 9 of ''Design and Analysis of Experiments'', 6th Edition, Wiley.