Statistical Process Control

Statistical process control (SPC) or statistical quality control (SQC) is the application of statistical methods to monitor and control the quality of a production process. This helps to ensure that the process operates efficiently, producing more specification-conforming products with less waste (rework or scrap). SPC can be applied to any process where the "conforming product" (product meeting specifications) output can be measured. Key tools used in SPC include run charts, control charts, a focus on continuous improvement, and the design of experiments. Manufacturing lines are a typical example of a process to which SPC is applied.

SPC is practiced in two phases: the first is the initial establishment of the process, and the second is its regular production use. In the second phase, a decision must be made about the period to be examined, depending upon the change in 5M&E conditions (Man, Machine, Material, Method, Movement, Environment) and the wear rate of parts used in the manufacturing process (machine parts, jigs, and fixtures).

An advantage of SPC over other methods of quality control, such as inspection, is that it emphasizes early detection and prevention of problems, rather than the correction of problems after they have occurred. In addition to reducing waste, SPC can reduce the time required to produce the product and makes it less likely that the finished product will need to be reworked or scrapped.


History

Statistical process control was pioneered by Walter A. Shewhart at Bell Laboratories in the early 1920s. Shewhart developed the control chart in 1924 and the concept of a state of statistical control. Statistical control is equivalent to the concept of exchangeability developed by the logician William Ernest Johnson, also in 1924, in his book ''Logic, Part III: The Logical Foundations of Science''. Along with a team at AT&T that included Harold Dodge and Harry Romig, he also worked to put sampling inspection on a rational statistical basis. Shewhart consulted with Colonel Leslie E. Simon on the application of control charts to munitions manufacture at the Army's Picatinny Arsenal in 1934. That successful application helped convince Army Ordnance to engage AT&T's George Edwards to consult on the use of statistical quality control among its divisions and contractors at the outbreak of World War II.

W. Edwards Deming invited Shewhart to speak at the Graduate School of the U.S. Department of Agriculture and served as the editor of Shewhart's book ''Statistical Method from the Viewpoint of Quality Control'' (1939), which was the result of that lecture. Deming was an important architect of the quality control short courses that trained American industry in the new techniques during World War II. The graduates of these wartime courses formed a new professional society in 1945, the American Society for Quality Control, which elected Edwards as its first president. Deming travelled to Japan during the Allied occupation and met with the Union of Japanese Scientists and Engineers (JUSE) in an effort to introduce SPC methods to Japanese industry.


'Common' and 'special' sources of variation

Shewhart read the new statistical theories coming out of Britain, especially the work of William Sealy Gosset, Karl Pearson, and Ronald Fisher. However, he understood that data from physical processes seldom produced a normal distribution curve (that is, a Gaussian distribution or 'bell curve'). He discovered that data from measurements of variation in manufacturing did not always behave the same way as data from measurements of natural phenomena (for example, Brownian motion of particles). Shewhart concluded that while every process displays variation, some processes display variation that is natural to the process ("''common''" sources of variation); these processes he described as being ''in (statistical) control''. Other processes additionally display variation that is not present in the causal system of the process at all times ("''special''" sources of variation), which Shewhart described as ''not in control''.


Application to non-manufacturing processes

Statistical process control is appropriate to support any repetitive process, and has been implemented in many settings where, for example, ISO 9000 quality management systems are used, including financial auditing and accounting, IT operations, health care processes, and clerical processes such as loan arrangement and administration, customer billing, etc. Despite criticism of its use in design and development, it is well placed to manage semi-automated data governance of high-volume data processing operations, for example in an enterprise data warehouse or an enterprise data quality management system. In the 1988 Capability Maturity Model (CMM), the Software Engineering Institute suggested that SPC could be applied to software engineering processes. The Level 4 and Level 5 practices of the Capability Maturity Model Integration (CMMI) use this concept. The application of SPC to non-repetitive, knowledge-intensive processes, such as research and development or systems engineering, has encountered skepticism and remains controversial. In ''No Silver Bullet'', Fred Brooks points out that the complexity, conformance requirements, changeability, and invisibility of software (Fred P. Brooks (1986) "No Silver Bullet — Essence and Accident in Software Engineering", Proceedings of the IFIP Tenth World Computing Conference, pp. 1069–1076) result in inherent and essential variation that cannot be removed. This implies that SPC is less effective in software development than in, for example, manufacturing.


Variation in manufacturing

In manufacturing, quality is defined as conformance to specification. However, no two products or characteristics are ever exactly the same, because any process contains many sources of variability. In mass manufacturing, the quality of a finished article has traditionally been ensured by post-manufacturing inspection of the product: each article (or a sample of articles from a production lot) is accepted or rejected according to how well it meets its design specifications. In contrast, SPC uses statistical tools to observe the performance of the production process in order to detect significant variations before they result in the production of a sub-standard article. Any source of variation at any point of time in a process will fall into one of two classes:
# ''Common'' causes: sometimes referred to as 'non-assignable' or 'normal' sources of variation. The term refers to any source of variation that consistently acts on the process, of which there are typically many. Causes of this type collectively produce a statistically stable and repeatable distribution over time.
# ''Special'' causes: sometimes referred to as 'assignable' sources of variation. The term refers to any factor causing variation that affects only some of the process output. They are often intermittent and unpredictable.

Most processes have many sources of variation; most of them are minor and may be ignored. If the dominant assignable sources of variation are detected, they can potentially be identified and removed. Once they are removed, the process is said to be 'stable'. When a process is stable, its variation should remain within a known set of limits, at least until another assignable source of variation occurs.

For example, a breakfast cereal packaging line may be designed to fill each cereal box with 500 grams of cereal. Some boxes will have slightly more than 500 grams, and some will have slightly less. When the package weights are measured, the data will demonstrate a distribution of net weights. If the production process, its inputs, or its environment (for example, the machine on the line) change, the distribution of the data will change. For example, as the cams and pulleys of the machinery wear, the cereal filling machine may put more than the specified amount of cereal into each box. Although this might benefit the customer, from the manufacturer's point of view it is wasteful and increases the cost of production. If the manufacturer finds the change and its source in a timely manner, the change can be corrected (for example, the cams and pulleys replaced).

From an SPC perspective, if the weight of each cereal box varies randomly, some higher and some lower, but always within an acceptable range, then the process is considered stable. If the cams and pulleys of the machinery start to wear out, the weights of the cereal boxes might not be random: the degraded functionality of the cams and pulleys may lead to a non-random linear pattern of increasing cereal box weights. We call this common cause variation. If, however, all the cereal boxes suddenly weighed much more than average because of an unexpected malfunction of the cams and pulleys, this would be considered a special cause variation.
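The control limits for box weights like those above can be sketched numerically. The following is a minimal illustration (the weight data are invented for the example, not taken from the article) of computing 3-sigma limits for an individuals chart, using the common moving-range estimate of process spread:

```python
# Minimal sketch: 3-sigma limits for an individuals (X) chart of box weights.
# The weights below are made-up illustration data.

def individuals_chart_limits(values):
    """Centre line and 3-sigma limits, estimating sigma from the average
    moving range (sigma ~ MR-bar / 1.128, the d2 constant for n = 2)."""
    n = len(values)
    mean = sum(values) / n
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma_hat = mr_bar / 1.128
    return mean - 3 * sigma_hat, mean, mean + 3 * sigma_hat

weights = [500.2, 499.8, 500.5, 499.6, 500.1, 500.3, 499.9, 500.0]
lcl, centre, ucl = individuals_chart_limits(weights)
print(f"LCL={lcl:.2f}  centre={centre:.2f}  UCL={ucl:.2f}")
# prints LCL=498.83  centre=500.05  UCL=501.27
```

Weights that drift outside these limits, as in the worn-cam scenario, would then be flagged for investigation.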


Application

The application of SPC involves three main phases of activity:
# Understanding the process and the specification limits.
# Eliminating assignable (special) sources of variation, so that the process is stable.
# Monitoring the ongoing production process, assisted by the use of control charts, to detect significant changes of mean or variation.


Control charts

The data from measurements of variations at points on the process map is monitored using control charts. Control charts attempt to differentiate "assignable" ("special") sources of variation from "common" sources. "Common" sources, because they are an expected part of the process, are of much less concern to the manufacturer than "assignable" sources. Using control charts is a continuous activity, ongoing over time.
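The differentiation of "assignable" from "common" variation is done mechanically by detection rules applied to the plotted points. As a hedged sketch (rule sets vary by convention; these follow the widely used "point beyond 3 sigma" and "run of 8 on one side of the centre line" forms), two such rules might be checked as follows:

```python
# Illustrative check of two common control-chart detection rules.
# The specific rules and run lengths vary between conventions
# (Western Electric, Nelson, etc.); this is one plausible form.

def detect_signals(values, centre, sigma):
    """Return indices of points that violate either rule."""
    signals = set()
    # Rule 1: a single point outside the 3-sigma control limits.
    for i, v in enumerate(values):
        if abs(v - centre) > 3 * sigma:
            signals.add(i)
    # Rule 2: eight successive points on the same side of the centre line.
    run = 0
    side = 0
    for i, v in enumerate(values):
        s = 1 if v > centre else (-1 if v < centre else 0)
        run = run + 1 if (s == side and s != 0) else (1 if s != 0 else 0)
        side = s
        if run >= 8:
            signals.add(i)
    return sorted(signals)
```

For example, `detect_signals([0.1, -0.2, 5.0, 0.3], centre=0.0, sigma=1.0)` flags only index 2, the point beyond the 3-sigma limit.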


Stable process

When the process does not trigger any of the control chart "detection rules", it is said to be "stable". A process capability analysis may be performed on a stable process to predict its ability to produce "conforming product" in the future. A stable process can be demonstrated by a process signature that is free of variances outside of the capability index. A process signature is the plotted points compared with the capability index.
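A process capability analysis typically reduces to computing capability indices such as Cp and Cpk. As a minimal sketch (the specification limits and measurements here are assumed for illustration):

```python
# Sketch of a process capability calculation for a stable process.
# Both indices assume an approximately normal, stable process.

from statistics import mean, stdev

def capability_indices(values, lsl, usl):
    """Cp compares the specification width to the process spread;
    Cpk additionally penalizes a process that is off-centre."""
    mu = mean(values)
    sigma = stdev(values)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk
```

A common rule of thumb treats Cpk of at least 1.33 as capable; for a perfectly centred process, Cp and Cpk coincide.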


Excessive variations

When the process triggers any of the control chart "detection rules" (or, alternatively, the process capability is low), other activities may be performed to identify the source of the excessive variation. The tools used in these extra activities include the Ishikawa diagram, designed experiments, and Pareto charts. Designed experiments are a means of objectively quantifying the relative importance (strength) of sources of variation. Once the sources of (special cause) variation are identified, they can be minimized or eliminated. Steps to eliminate a source of variation might include: development of standards, staff training, error-proofing, and changes to the process itself or its inputs.


Process stability metrics

When monitoring many processes with control charts, it is sometimes useful to calculate quantitative measures of the stability of the processes. These metrics can then be used to identify and prioritize the processes that are most in need of corrective actions, and can be viewed as supplementing the traditional process capability metrics. Several metrics have been proposed, as described by Ramirez and Runger. They are (1) a stability ratio, which compares the long-term variability to the short-term variability; (2) an ANOVA test, which compares the within-subgroup variation to the between-subgroup variation; and (3) an instability ratio, which compares the number of subgroups that have one or more violations of the Western Electric rules to the total number of subgroups.
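The first of these metrics can be sketched in a few lines. This is one plausible form, not necessarily the exact formula of Ramirez and Runger: short-term variability estimated from the average moving range, long-term variability from the overall sample variance.

```python
# Sketch of a stability ratio: long-term variance over short-term variance.
# Conventions differ between authors; this is an illustrative form.

from statistics import variance

def stability_ratio(values):
    mrs = [abs(values[i] - values[i - 1]) for i in range(1, len(values))]
    sigma_st = (sum(mrs) / len(mrs)) / 1.128  # short-term, from MR-bar
    var_lt = variance(values)                 # long-term, overall variance
    return var_lt / sigma_st ** 2
```

A ratio near 1 suggests a stable process; a ratio much greater than 1 suggests drift or shifts between subgroups, since the overall spread then exceeds what point-to-point variation alone would produce.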


Mathematics of control charts

Digital control charts use logic-based rules that determine "derived values" which signal the need for correction. For example:

:derived value = last value + average absolute difference between the last ''N'' numbers
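The example rule above can be transcribed directly. This is an assumption about how such a derived value might be computed from the stated formula, not a standard named algorithm:

```python
# Direct transcription of the example rule: the last observed value plus
# the average absolute difference between the last n observations.

def derived_value(history, n):
    last = history[-1]
    window = history[-n:]
    diffs = [abs(window[i] - window[i - 1]) for i in range(1, len(window))]
    return last + sum(diffs) / len(diffs)

# e.g. derived_value([10, 12, 11, 15], 3):
# window = [12, 11, 15], diffs = [1, 4], average = 2.5, result = 17.5
```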


See also

* ANOVA gauge R&R
* Distribution-free control chart
* Electronic design automation
* Industrial engineering
* Process window index
* Process capability index
* Quality assurance
* Reliability engineering
* Six Sigma
* Stochastic control
* Total quality management


References


Bibliography

* Barlow, R. E. & Irony, T. Z. (1992) "Foundations of statistical quality control" in Ghosh, M. & Pathak, P. K. (eds.) ''Current Issues in Statistical Inference: Essays in Honor of D. Basu'', Hayward, CA: Institute of Mathematical Statistics, 99–112.
* Bergman, B. (2009) "Conceptualistic Pragmatism: A framework for Bayesian analysis?", ''IIE Transactions'', 41, 86–93.
* Deming, W. E. (1975) "On probability as a basis for action", ''The American Statistician'', 29(4), 146–152.
* Deming, W. E. (1982) ''Out of the Crisis: Quality, Productivity and Competitive Position''.
* Grant, E. L. (1946) ''Statistical Quality Control''.
* Oakland, J. (2002) ''Statistical Process Control''.
* Salacinski, T. (2015) ''SPC - Statistical Process Control''. The Warsaw University of Technology Publishing House.
* Shewhart, W. A. (1931) ''Economic Control of Quality of Manufactured Product''.
* Shewhart, W. A. (1939) ''Statistical Method from the Viewpoint of Quality Control''.
* Wheeler, D. J. (2000) ''Normality and the Process-Behaviour Chart''.
* Wheeler, D. J. & Chambers, D. S. (1992) ''Understanding Statistical Process Control''.
* Wheeler, D. J. (1999) ''Understanding Variation: The Key to Managing Chaos'', 2nd edition. SPC Press.
* Wise, S. A. & Fair, D. C. (1998) ''Innovative Control Charting: Practical SPC Solutions for Today's Manufacturing Environment''. ASQ Quality Press.


External links


MIT Course - Control of Manufacturing Processes