Six Sigma

From Wikipedia, the free encyclopedia

For the 1960s rhythm and blues band, see Sigma 6.

Six Sigma is a quality management program whose goal is "six sigma" levels of quality. It was pioneered at Motorola in the mid-1980s by Bob Galvin, who succeeded his father, company founder Paul Galvin, as head of the company, and by Motorola engineer Bill Smith. It has since spread to many other manufacturing companies, including GE, Honeywell, Raytheon, Seagate Technology, and Microsoft, but it can be applied wherever the control of variation is desired. In recent years it has begun to branch out into the service industry, and in 2000 Fort Wayne, Indiana became the first city to implement the program in a city government. Some, claiming that Six Sigma's impact has not yet been fully realized, advocate an open source approach so that the principles of Six Sigma might be more widely adopted.

In statistics, sigma refers to the standard deviation of a set of data; "six sigma" therefore refers to six standard deviations. Mathematically, assuming that defects occur according to a standard normal distribution, six standard deviations correspond to approximately two quality failures per billion parts manufactured. In practical application of the Six Sigma methodology, however, the rate is taken to be 3.4 per million; see below. Initially, many believed that such high process reliability was impossible, and three sigma quality (67,000 defects per million opportunities, or DPMO, a figure that already reflects the 1.5 sigma shift discussed below) was considered acceptable. Market leaders, however, have measurably reached six sigma quality in numerous processes.
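
As a quick check of the two-per-billion figure, here is a minimal sketch using the survival function of the standard normal distribution in Python with scipy (the 3.4-per-million figure is derived in the next section):

    from scipy.stats import norm

    # Area of the standard normal curve beyond +/- 6 standard deviations:
    print(2 * norm.sf(6.0))   # ~1.97e-09, i.e. about two failures per billion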

Why six?

Under the standard normal distribution, only about two billionths of the area under the curve falls beyond six standard deviations, in contrast to the 3.4 millionths publicized by Six Sigma promoters. Confusingly, that published value actually corresponds to precision within 4.5 standard deviations, reflecting a 1.5 standard deviation "shift". The shift is used to account for model inaccuracy: defects in manufacturing processes do not always follow the normal distribution. Instead, processes tend to "drift" over time, causing the majority of the error to fall on one side of the distribution and, as a result, a higher defect rate than the unshifted model would predict. With the shift built in, a process that drifts by as much as 1.5 standard deviations still stays within 3.4 DPMO.
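
Continuing the sketch above, the published figures drop straight out of the shifted model; only the nearer tail is counted, since drift is assumed to push nearly all of the error to one side:

    from scipy.stats import norm

    # Six sigma minus the 1.5 sigma shift leaves 4.5 sigma to the nearer limit:
    print(norm.sf(4.5) * 1e6)   # ~3.4 defects per million opportunities (DPMO)

    # The same shift applied to a three sigma process leaves 1.5 sigma:
    print(norm.sf(1.5) * 1e6)   # ~66,800 DPMO, the "67,000" figure quoted above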

However, the 1.5 sigma shift assumption is not without its critics. Donald J. Wheeler, a respected quality professional, labels it "goofy", arguing that it is misapplied in practice and probably inaccurate anyway. Often, implementers of Six Sigma simply add 1.5 "sigmas" to their sigma calculation, transforming a 4.5 sigma process (3.4 DPMO) into a 6.0 sigma process. But this reflects a misunderstanding of the nature of the shift. If short-term data are used (data that do not reflect potential process drift), 1.5 sigmas should be subtracted from the final sigma calculation to account for that drift. Thus, achieving 3.4 DPMO on short-term data reflects a three sigma process, not a six sigma one, once stated as a long-term failure rate. Alternatively, if long-term data are used, the process drift has already been accounted for, and no addition or subtraction is necessary.
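
A small sketch of the correct bookkeeping described above (the helper name sigma_level is introduced only for this illustration):

    from scipy.stats import norm

    def sigma_level(dpmo):
        """One-sided sigma level implied by a defect rate in defects per million."""
        return norm.isf(dpmo / 1e6)

    z_short = sigma_level(3.4)   # ~4.5: sigma level measured from short-term data
    z_long = z_short - 1.5       # subtract the shift to state long-term capability
    print(round(z_short, 1), round(z_long, 1))   # 4.5 3.0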

The other common objection is that the choice of a shift of 1.5 sigmas is too arbitrary and probably inaccurate. Some suggest that the 1.5 sigma shift was implemented for marketing reasons, so that the program could be named Six Sigma instead of "4.5 Sigma" without setting the unrealistic goal of two defects per billion. However, according to original training material used at Motorola in 1985, the point at which a shift became detectable with a sample size of 4 was 1.5 standard deviations, suggesting that the number was not arbitrarily selected.
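
One common reading of that detectability claim, offered here as a sketch rather than as the original Motorola derivation: on a Shewhart X-bar chart, the control limits for subgroup means sit 3 sigma/sqrt(n) from the centre line, which for n = 4 is exactly 1.5 sigma, so a shift of that size is the smallest that reaches the limits:

    import math
    from scipy.stats import norm

    n = 4                     # subgroup size cited in the Motorola material
    limit = 3 / math.sqrt(n)  # X-bar control limit in units of process sigma
    print(limit)              # 1.5

    # Chance that a single subgroup mean signals after a 1.5 sigma shift:
    shift = 1.5
    p = norm.sf((limit - shift) * math.sqrt(n)) + norm.cdf((-limit - shift) * math.sqrt(n))
    print(round(p, 3))        # ~0.5: such a shift sits right at the edge of detection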

In practice, the principle of six standard deviations of quality between the upper and lower specification limits is often not applied with mathematical rigor. Instead, Six Sigma is treated as a methodology for minimizing defects generally. It is used this way in non-manufacturing environments, where it serves as an analogy to manufacturing processes rather than as a statement about statistical distributions. The frequent misuse of the 1.5 sigma shift in manufacturing reflects the same informal attitude.

The ±1.5 sigma drift

Everyone with a Six Sigma program knows about the ±1.5 sigma drift of a process mean, said to be experienced by all processes. What this says is that if we are manufacturing a product specified at 100 ± 3 cm (97–103 cm), over time the output may drift up to 98.5–104.5 cm or down to 95.5–101.5 cm, something that might well concern our customers. So where does the "±1.5" come from?

The ±1.5 sigma shift was introduced by Mikel Harry. Where did he get it? Harry refers to a paper written in 1975 by Evans, "Statistical Tolerancing: The State of the Art. Part 3. Shifts and Drifts". The paper is about tolerancing, that is, how the overall error in an assembly is affected by the errors in its components. Evans in turn refers to a 1962 paper by Bender, "Benderizing Tolerances – A Simple Practical Probability Method for Handling Tolerances for Limit Stack Ups". Bender looked at the classical situation of a stack of disks and how the overall error in the size of the stack relates to the errors in the individual disks. Based on "probability, approximations and experience", he suggested:

v = 1.5*SQRT(var X)
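
In tolerance stack-up terms, Bender's factor inflates the usual root-sum-square combination of component variances; a sketch with invented disk variances:

    import math

    # Hypothetical disk thickness variances (cm^2), invented for this illustration:
    component_vars = [0.010, 0.020, 0.015, 0.005]

    rss = math.sqrt(sum(component_vars))    # classical root-sum-square stack error
    bender = 1.5 * rss                      # Bender's inflated practical tolerance
    print(round(rss, 3), round(bender, 3))  # 0.224 0.335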

What has this got to do with monitoring the myriad of processes that people are actually concerned about? Very little. Harry nevertheless took things a step further. Imagine a process where five samples are taken every half hour and plotted on a control chart. Harry considered the "instantaneous" initial five samples as "short term" (Harry's n = 5) and the samples throughout the day as "long term" (Harry's g = 50 points). Because of random variation in the first five points, the mean of the initial sample differs from the overall mean. Using the equation above, Harry derived a relationship between short-term and long-term capability, producing a capability shift, or "Z shift", of 1.5. Over time, the original meanings of "short term" and "long term" have been changed, resulting in the current notion of "long term" drifting means.
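
The effect Harry described can be reproduced in a simulation (the size and pattern of the drift below are invented for illustration; this is not his derivation): when subgroup means wander, the overall "long term" sigma exceeds the within-subgroup "short term" sigma:

    import numpy as np

    rng = np.random.default_rng(0)
    g, n = 50, 5   # Harry's g = 50 subgroups of n = 5 samples

    # A process whose mean wanders from subgroup to subgroup:
    means = rng.normal(0.0, 1.5, size=g)
    data = rng.normal(means[:, None], 1.0, size=(g, n))

    sigma_st = np.sqrt(np.var(data, axis=1, ddof=1).mean())  # within-subgroup
    sigma_lt = data.std(ddof=1)                              # overall
    print(round(sigma_st, 2), round(sigma_lt, 2))  # long term exceeds short term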

Harry has clung tenaciously to the "1.5", but over the years its derivation has been modified. In a recent note, Harry writes: "We employed the value of 1.5 since no other empirical information was available at the time of reporting." In other words, 1.5 has now become an empirical rather than a theoretical value. A further softening from Harry: "… the 1.5 constant would not be needed as an approximation".

Despite this, industry has fixed on the idea that it is impossible to keep processes on target: no matter what is done, process means will drift by ±1.5 sigma. In other words, suppose a process has a target value of 10.0 and control limits that work out to, say, 13.0 and 7.0. "Long term", the mean will supposedly drift to 11.5 (or down to 8.5), with the control limits moving to 14.5 and 8.5 (or 11.5 and 5.5). This is nonsense.

The simple truth is that any process whose mean changes by 1.5 sigma, or by any other amount, is not in statistical control. Such a change can often be detected as a trend on a control chart. A process that is not in control is not predictable; it may begin to produce defects no matter where the specification limits have been set.
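
As a sketch of that detection (run rules vary from source to source; eight successive points on one side of the centre line is one common choice), a sustained 1.5 sigma shift pushes nearly every subgroup mean to the same side, so the rule fires almost immediately:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 4                                        # subgroup size
    means = rng.normal(1.5, 1 / np.sqrt(n), 40)  # subgroup means after a 1.5 sigma shift

    # Run rule: eight successive subgroup means on the same side of the centre line (0).
    for i in range(len(means) - 7):
        window = means[i:i + 8]
        if (window > 0).all() or (window < 0).all():
            print("shift signalled at subgroup", i + 8)
            break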

World Class Quality means "On target with minimum variation".

Basic methodologies

DMAIC

Basic methodology for improving existing processes:

  • Define the out-of-tolerance range.
  • Measure key internal processes critical to quality.
  • Analyze why defects occur and explore opportunities for improvement.
  • Improve the process to stay within tolerance.
  • Control the process to stay within goals.

DMADV

Basic methodology for introducing new processes:

  • Define the process and where it would fail to meet customer needs.
  • Measure and determine whether the process meets customer needs.
  • Analyze the options to meet customer needs.
  • Design in changes to the process to meet customer needs.
  • Verify the changes have met customer needs.
