Taguchi methods


Taguchi methods (Japanese: タグチメソッド), sometimes called robust design methods, are statistical methods developed by Genichi Taguchi to improve the quality of manufactured goods, and more recently also applied to engineering,[1] biotechnology,[2][3] marketing and advertising.[4] Professional statisticians have welcomed the goals and improvements brought about by Taguchi methods, particularly Taguchi's development of designs for studying variation, but have criticized the inefficiency of some of his proposals.[5]

Taguchi's work includes three principal contributions to statistics: a specific loss function, the philosophy of off-line quality control, and innovations in the design of experiments.

Loss functions

Loss functions in statistical theory

Traditionally, statistical methods have relied on mean-unbiased estimators of treatment effects: Under the conditions of the Gauss-Markov theorem, least squares estimators have minimum variance among all mean-unbiased estimators. The emphasis on comparisons of means also draws (limiting) comfort from the law of large numbers, according to which the sample means converge to the true mean. Fisher's textbook on the design of experiments emphasized comparisons of treatment means.

However, loss functions were avoided by Ronald A. Fisher.[6]

Taguchi's use of loss functions

Taguchi knew statistical theory mainly from the followers of Ronald A. Fisher, who also avoided loss functions. Reacting to Fisher's methods in the design of experiments, Taguchi interpreted them as being adapted to improving the mean outcome of a process. Indeed, Fisher's work had been largely motivated by programmes to compare agricultural yields under different treatments and blocks, and such experiments were done as part of a long-term programme to improve harvests.

However, Taguchi realised that in much industrial production, there is a need to produce an outcome on target, for example, to machine a hole to a specified diameter, or to manufacture a cell to produce a given voltage. He also realised, as had Walter A. Shewhart and others before him, that excessive variation lay at the root of poor manufactured quality and that reacting to individual items inside and outside specification was counterproductive.

He therefore argued that quality engineering should start with an understanding of quality costs in various situations. In much conventional industrial engineering, the quality costs are simply represented by the number of items outside specification multiplied by the cost of rework or scrap. However, Taguchi insisted that manufacturers broaden their horizons to consider cost to society. Though the short-term costs may simply be those of non-conformance, any item manufactured away from nominal would result in some loss to the customer or the wider community through early wear-out; difficulties in interfacing with other parts, themselves probably wide of nominal; or the need to build in safety margins. These losses are externalities and are usually ignored by manufacturers, which are more interested in their private costs than social costs. Such externalities prevent markets from operating efficiently, according to analyses of public economics. Taguchi argued that such losses would inevitably find their way back to the originating corporation (in an effect similar to the tragedy of the commons), and that by working to minimise them, manufacturers would enhance brand reputation, win markets and generate profits.

Such losses are, of course, very small when an item is near to nominal. Donald J. Wheeler characterised the region within specification limits as that where we deny that losses exist. As we diverge from nominal, losses grow until the point where they are too great to deny and the specification limit is drawn. All such losses are, as W. Edwards Deming would describe them, unknown and unknowable, but Taguchi wanted to find a useful way of representing them statistically. Taguchi specified three situations:

  1. Larger the better (for example, agricultural yield);
  2. Smaller the better (for example, carbon dioxide emissions); and
  3. On-target, minimum-variation (for example, a mating part in an assembly).

The first two cases are represented by simple monotonic loss functions. In the third case, Taguchi adopted a squared-error loss function, on the grounds that it is the first symmetric term in a Taylor-series expansion of a general loss function about the target, and that the average squared-error loss decomposes into the variance of the process plus the squared deviation of its mean from the target.
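The on-target case can be sketched numerically. The quadratic loss k(y - T)^2 has the property that its average over a sample equals k times (variance plus squared bias), so minimising expected loss means simultaneously reducing spread and centring the process. A minimal sketch (the loss constant k and the sample data are invented):

```python
import statistics

def quadratic_loss(y, target, k=1.0):
    """Taguchi's quadratic loss for a single item: k * (y - target)^2."""
    return k * (y - target) ** 2

def expected_loss(measurements, target, k=1.0):
    """Average loss over a sample: k * (variance + squared bias)."""
    mean = statistics.fmean(measurements)
    var = statistics.pvariance(measurements)  # population variance
    bias = mean - target
    return k * (var + bias ** 2)

# Illustrative data: hole diameters (mm) machined to a 10.0 mm target
diameters = [10.02, 9.98, 10.05, 9.97, 10.03]
per_item = [quadratic_loss(d, 10.0) for d in diameters]

# The identity: mean per-item loss == k * (variance + squared bias)
assert abs(sum(per_item) / len(per_item) - expected_loss(diameters, 10.0)) < 1e-9
```

This decomposition is why quality programmes in the Taguchi tradition attack variance even when the process mean is already on target.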

Reception of Taguchi's ideas by statisticians

Though many of Taguchi's concerns and conclusions are welcomed by statisticians and economists, some ideas have been especially criticized. For example, Taguchi's recommendation that industrial experiments maximise some signal-to-noise ratio (representing the magnitude of the mean of a process compared to its variation) has been criticized widely.[citation needed]
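The signal-to-noise ratios in question are conventionally defined separately for the three situations above. A sketch of the standard textbook forms (the sample data are invented):

```python
import math

def sn_larger_the_better(ys):
    """SN = -10 log10(mean(1/y^2)); to be maximised."""
    return -10 * math.log10(sum(1 / y ** 2 for y in ys) / len(ys))

def sn_smaller_the_better(ys):
    """SN = -10 log10(mean(y^2)); to be maximised."""
    return -10 * math.log10(sum(y ** 2 for y in ys) / len(ys))

def sn_nominal_the_best(ys):
    """SN = 10 log10(ybar^2 / s^2): mean relative to spread."""
    n = len(ys)
    ybar = sum(ys) / n
    s2 = sum((y - ybar) ** 2 for y in ys) / (n - 1)
    return 10 * math.log10(ybar ** 2 / s2)

# Two hypothetical runs with the same mean but different spread:
tight = [5.0, 5.0, 5.1, 5.1]
loose = [4.0, 6.0, 4.5, 5.5]
assert sn_nominal_the_best(tight) > sn_nominal_the_best(loose)
```

Much of the criticism turns on the point that maximising such a compound statistic conflates location and dispersion effects that can be modelled separately.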

Off-line quality control

Taguchi's rule for manufacturing

Taguchi realised that the best opportunity to eliminate variation in final product quality is during the design of a product and its manufacturing process. Consequently, he developed a strategy for quality engineering that can be used in both contexts. The process has three stages:

  • System design
  • Parameter (measure) design
  • Tolerance design

System design

This is design at the conceptual level, involving creativity and innovation.

Parameter design

Once the concept is established, the nominal values of the various dimensions and design parameters need to be set, the detail design phase of conventional engineering. Taguchi's radical insight was that the exact choice of values required is under-specified by the performance requirements of the system. In many circumstances, this allows the parameters to be chosen so as to minimize the effects on performance arising from variation in manufacture, environment and cumulative damage. This is sometimes called robustification.

Robust parameter designs consider both controllable factors and uncontrollable noise variables; they seek to exploit the relationships between them and to find settings that minimise the effects of the noise variables.

Tolerance design

Main article: Pareto principle

With a successfully completed parameter design, and an understanding of the effect that the various parameters have on performance, resources can be focused on reducing and controlling variation in the critical few dimensions.
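A common formalisation of tolerance design balances the quadratic loss against the cost of rework: the tolerance is set at the deviation where the customer's loss k·d^2 equals the cost A of fixing the item in the factory, giving a tolerance of sqrt(A/k). A minimal sketch (the cost figures are invented):

```python
import math

def economic_tolerance(repair_cost, k):
    """Deviation at which quadratic customer loss k * d^2 equals repair_cost."""
    return math.sqrt(repair_cost / k)

# Hypothetical figures: a $50 customer loss at 0.5 mm off target fixes k
k = 50 / 0.5 ** 2                                # 200 $/mm^2
tol = economic_tolerance(repair_cost=2.0, k=k)   # $2 to rework in the factory
print(f"tolerance = ±{tol:.3f} mm")              # rework any item beyond this
```

On these numbers the economic tolerance is ±0.1 mm: beyond that deviation it is cheaper to rework the item than to pass its loss on to the customer.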

Design of experiments

Taguchi developed his experimental theories independently; he encountered the works of R. A. Fisher's school only in 1954. Taguchi's framework for design of experiments is idiosyncratic and often flawed, but contains much that is of enormous value.[citation needed] He made a number of innovations.

Outer arrays

Taguchi's designs aimed to allow greater understanding of variation than did many of the traditional designs from the analysis of variance (following Fisher). Taguchi contended that conventional sampling is inadequate here as there is no way of obtaining a random sample of future conditions.[7] In Fisher's design of experiments and analysis of variance, experiments aim to reduce the influence of nuisance factors to allow comparisons of the mean treatment-effects. Variation becomes even more central in Taguchi's thinking.

Taguchi proposed extending each experiment with an "outer array" (possibly an orthogonal array); the "outer array" should simulate the random environment in which the product would function. This is an example of judgmental sampling. Many quality specialists have been using "outer arrays".

Later innovations in outer arrays resulted in "compounded noise." This involves combining a few noise factors to create two levels in the outer array: First, noise factors that drive output lower, and second, noise factors that drive output higher. "Compounded noise" simulates the extremes of noise variation but uses fewer experimental runs than would previous Taguchi designs.
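The crossed-array layout can be sketched as follows (the process model, factor count and coefficients are invented for illustration; Taguchi would typically use an orthogonal array such as an L4 or L8 as the inner array). Each inner (control-factor) run is evaluated under every outer (noise) condition, here a two-level compounded-noise axis of the kind just described, and a signal-to-noise statistic is computed per inner run:

```python
import itertools
import math

# Inner array: 2^3 full factorial over control factors A, B, C (coded -1/+1).
inner = list(itertools.product([-1, 1], repeat=3))

# Outer array: two compounded-noise conditions, one driving the response
# low and one driving it high (coded -1/+1 on a single noise axis).
outer = [-1, 1]

def response(control, noise):
    """Invented process model: sensitivity to noise depends on factor C."""
    a, b, c = control
    return 10 + 2 * a + b + (1.5 - c) * noise  # c = +1 damps the noise term

def sn_nominal(ys):
    ybar = sum(ys) / len(ys)
    s2 = sum((y - ybar) ** 2 for y in ys) / (len(ys) - 1)
    return 10 * math.log10(ybar ** 2 / s2)

# Sweep each inner run across the whole outer array; the per-run SN
# statistic rewards settings that are insensitive to the noise factor.
for run in inner:
    ys = [response(run, z) for z in outer]
    print(run, [round(y, 2) for y in ys], round(sn_nominal(ys), 1))
```

In this invented model every run with C = +1 scores a higher SN than its C = -1 counterpart: precisely the control-by-noise interaction the crossed layout is designed to expose.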

Management of interactions

Interactions, as treated by Taguchi

Many of the orthogonal arrays that Taguchi has advocated are saturated, allowing no scope for estimation of interactions; this is a continuing topic of controversy. However, this is true only for the "control factors" of the "inner array". By combining an inner array of control factors with an outer array of "noise factors", Taguchi's approach is claimed to provide "full information" on control-by-noise interactions, which Taguchi argues are the most important in achieving a design that is robust to noise-factor variation. Its adherents claim that the approach therefore provides more complete interaction information than typical fractional factorial designs.

  • Followers of Taguchi argue that the designs offer rapid results and that interactions can be eliminated by proper choice of quality characteristics. That notwithstanding, a "confirmation experiment" offers protection against any residual interactions. If the quality characteristic represents the energy transformation of the system, then the "likelihood" of control factor-by-control factor interactions is greatly reduced, since "energy" is "additive".

Inefficiencies of Taguchi's designs

  • Interactions are part of the real world. In Taguchi's arrays, interactions are confounded and difficult to resolve.

Statisticians in response surface methodology (RSM) advocate the "sequential assembly" of designs: In the RSM approach, a screening design is followed by a "follow-up design" that resolves only the confounded interactions judged worth resolution. A second follow-up design may be added (time and resources allowing) to explore possible high-order univariate effects of the remaining variables, as high-order univariate effects are less likely in variables already eliminated for having no linear effect. With the economy of screening designs and the flexibility of follow-up designs, sequential designs have great statistical efficiency. The sequential designs of response surface methodology require far fewer experimental runs than would a sequence of Taguchi's designs.[8]
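The sequential-assembly idea can be illustrated with the classic fold-over follow-up (a pure-Python sketch; the three-factor example is illustrative). A resolution III screening fraction aliases each main effect with a two-factor interaction; appending the sign-reversed runs as the follow-up design breaks exactly those aliases:

```python
import itertools

# Resolution III screening: 3 factors in 4 runs (2^(3-1), generator C = AB).
base = list(itertools.product([-1, 1], repeat=2))
screening = [(a, b, a * b) for a, b in base]

# Fold-over follow-up: rerun with every sign reversed.  The combined
# 8-run design de-aliases main effects from two-factor interactions.
foldover = [tuple(-x for x in run) for run in screening]
combined = screening + foldover

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# In the screening half, main effect C is fully aliased with interaction AB...
assert dot([r[2] for r in screening],
           [r[0] * r[1] for r in screening]) == len(screening)

# ...but after the fold-over the two columns are orthogonal.
assert dot([r[2] for r in combined],
           [r[0] * r[1] for r in combined]) == 0
```

The second stage is run only if the screening results suggest the aliased interactions matter, which is the source of the economy claimed for sequential designs.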

Analysis of experiments


Genichi Taguchi has made valuable contributions to statistics and engineering. His emphasis on loss to society, his techniques for investigating variation in experiments, and his overall strategy of system, parameter and tolerance design have been influential in improving manufactured quality worldwide.[9] Although some of the statistical aspects of the Taguchi methods are disputed, there is no dispute that they are widely applied to various processes. A quick search in related journals, as well as on the World Wide Web, reveals that the method is being successfully implemented in diverse areas, such as the design of VLSI circuits; the optimization of communication and information networks; the development of electronic circuits; laser engraving of photomasks; cash-flow optimization in banking; government policymaking; runway-utilization improvement at airports; and even robust eco-design.[10]

References


  1. ^ Rosa, Jorge Luiz; Robin, Alain; Silva, M. B.; Baldan, Carlos Alberto; Peres, Mauro Pedro. "Electrodeposition of copper on titanium wires: Taguchi experimental design approach". Journal of Materials Processing Technology. 209: 1181–1188. doi:10.1016/j.jmatprotec.2008.03.021. 
  2. ^ Rao, Ravella Sreenivas; C. Ganesh Kumar; R. Shetty Prakasham; Phil J. Hobbs (March 2008). "The Taguchi methodology as a statistical tool for biotechnological applications: A critical appraisal". Biotechnology Journal. 3 (4): 510–523. doi:10.1002/biot.200700201. PMID 18320563. Retrieved 2009-04-01. 
  3. ^ Rao, R. Sreenivas; Prakasham, R. S.; Prasad, K. Krishna; Rajesham, S.; Sarma, P. N.; Rao, L. Venkateswar (April 2004). "Xylitol production by Candida sp.: parameter optimization using Taguchi approach". Process Biochemistry. 39 (8): 951–956. doi:10.1016/S0032-9592(03)00207-3.
  4. ^ Selden, Paul H. (1997). Sales Process Engineering: A Personal Workshop. Milwaukee, Wisconsin: ASQ Quality Press. p. 237. ISBN 0-87389-418-9. 
  5. ^ Professional statisticians have welcomed Taguchi's concerns and emphasis on understanding variation (and not just the mean): these statisticians celebrate the achievements of Taguchi, whose books and visits abroad helped industrial leaders appreciate the role of statistical methods in total quality management. That said, professional statisticians have criticized some of Taguchi's designs as being less efficient than the traditional designs or optimal designs of response surface methodology. At the same time, the industrial adoption of even inefficient Taguchi designs demonstrated a market for response surface methodology, which had been neglected by statistical researchers and textbooks. Taguchi's successes forced many improvements on statistical textbooks, which had to become more accessible to industrial practitioners.
  6. ^ In fact, Fisher labelled loss functions as being better suited for American businessmen and Soviet commissars than for empirical scientists (in his 1956 attack on Wald in the Journal of the Royal Statistical Society).
  7. ^ Similar truisms about the problem of induction had been voiced by Hume and (more recently) by W. Edwards Deming in his discussion of analytic studies.
  8. ^ Statisticians have developed designs that enable experiments to use fewer replications (or experimental runs), enabling savings over Taguchi's proposed designs: Box-Draper, Atkinson-Donev-Tobias, Goos, and Wu-Hamada discuss the sequential assembly of designs.
  9. ^ Ben-Gal, I. (2005). "On the Use of Data Compression Measures to Assess Robust Designs". IEEE Transactions on Reliability. 54 (3): 381–388. Available at: http://www.eng.tau.ac.il/~bengal/Journal%20Paper.pdf
  10. ^ Ben-Gal, I.; Katz, R.; Bukchin, J. "Robust Eco-Design: A New Application for Quality Engineering". IIE Transactions. 40 (10): 907–918. Available at: http://www.eng.tau.ac.il/~bengal/Eco_Design.pdf