Morris method

In applied statistics, the Morris method for global sensitivity analysis is a so-called one-step-at-a-time method (OAT), meaning that in each run only one input parameter is given a new value. It facilitates a global sensitivity analysis by making a number r of local changes at different points x^(1), …, x^(r) of the possible range of input values.

Method's details

Elementary effects' distribution

The finite distribution of elementary effects associated with the i-th input factor is obtained by randomly sampling different points x from the input space Ω, and is denoted by Fi.[1]
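
For concreteness, the quantity whose distribution is Fi can be written out. The following is the standard definition of an elementary effect from Morris (1991); the symbols d_i, Δ and e_i are introduced here only for illustration:

    d_i(x) = \frac{y(x + \Delta e_i) - y(x)}{\Delta}

where y is the model output, Δ is a predetermined step (a multiple of 1/(p − 1) on a p-level grid), e_i is the i-th unit vector, and x is any point of Ω such that x + Δe_i still lies in Ω. Each sampled point contributes one value of d_i, so r samples give r draws from Fi.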

Variations

In the original work of Morris the two sensitivity measures proposed were the mean, µ, and the standard deviation, σ, of Fi. However, the measure µ has the drawback that, if the distribution Fi contains negative elements, which occurs when the model is non-monotonic, some effects may cancel each other out when computing the mean. Thus, the measure µ on its own is not reliable for ranking factors in order of importance. It is necessary to consider the values of µ and σ at the same time, because a factor with elementary effects of different signs (that cancel each other out) would have a low value of µ but a considerable value of σ, which prevents the screening exercise from underestimating the factor's importance.[1]
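
In symbols, writing d_i^(1), …, d_i^(r) for the r sampled elementary effects of the i-th factor (notation introduced here only for illustration), the two original measures are

    \mu_i = \frac{1}{r} \sum_{j=1}^{r} d_i^{(j)}, \qquad
    \sigma_i = \sqrt{ \frac{1}{r} \sum_{j=1}^{r} \left( d_i^{(j)} - \mu_i \right)^2 }

(implementations differ on whether the standard deviation uses r or r − 1 in the denominator).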

µ*

Because of this cancellation of effects of opposite signs, when the goal is to rank factors in order of importance by means of a single sensitivity measure, the advice is to use µ*, which, by taking the absolute value of the elementary effects, avoids such cancellations.[1]
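
With the same illustrative notation, this measure is

    \mu_i^{*} = \frac{1}{r} \sum_{j=1}^{r} \left| d_i^{(j)} \right|

i.e. the mean of the absolute values of the sampled elementary effects.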

In the revised Morris method, µ* is used to detect input factors with an important overall influence on the output, while σ is used to detect factors involved in interactions with other factors or whose effect is non-linear.[1]

Method's steps

The method starts by sampling a set of start values within the defined ranges of possible values for all input variables and calculating the subsequent model outcome. The second step changes the values for one variable (all other inputs remaining at their start values) and calculates the resulting change in model outcome compared to the first run. Next, the values for another variable are changed (the previous variable is kept at its changed value and all other ones kept at their start values) and the resulting change in model outcome compared to the second run is calculated. This goes on until all input variables are changed. This procedure is repeated r times (where r is usually taken between 5 and 15), each time with a different set of start values, which leads to r(k + 1) runs, where k is the number of input variables. This makes the method very efficient compared to more demanding methods for sensitivity analysis.[1]
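
A minimal sketch of this procedure in Python is given below. It assumes inputs scaled to the unit hypercube [0, 1]^k and a p-level grid, and it uses a simplified trajectory in which each factor is only increased by the step Δ; Morris's original design also allows decreases and an optimized choice of trajectories. The function and model names are illustrative, not part of any published implementation.

    import numpy as np

    def morris_screening(model, k, r=10, p=4, seed=0):
        """Simplified Morris screening on the unit hypercube [0, 1]^k.

        model -- callable mapping a length-k array to a scalar output
        k     -- number of input factors
        r     -- number of trajectories (start points), usually 5 to 15
        p     -- number of grid levels; the step is Delta = p / (2*(p - 1))
        Returns (mu, mu_star, sigma), each an array of length k.
        """
        rng = np.random.default_rng(seed)
        delta = p / (2 * (p - 1))                    # common choice of step size
        levels = np.arange(p) / (p - 1)              # the p grid levels in [0, 1]
        start_levels = levels[levels <= 1 - delta + 1e-12]  # levels from which x + delta stays in [0, 1]
        effects = np.empty((r, k))                   # one elementary effect per factor per trajectory

        for j in range(r):
            x = rng.choice(start_levels, size=k)     # random start point (run 1 of the k + 1 runs)
            y_prev = model(x)
            for i in rng.permutation(k):             # change one factor at a time, in random order
                x[i] += delta
                y_new = model(x)
                effects[j, i] = (y_new - y_prev) / delta
                y_prev = y_new

        mu = effects.mean(axis=0)                    # may suffer from sign cancellation
        mu_star = np.abs(effects).mean(axis=0)       # revised measure of overall influence
        sigma = effects.std(axis=0)                  # flags interactions / non-linearity
        return mu, mu_star, sigma

    # Toy model: one strong linear factor, two interacting factors, one negligible factor
    def toy_model(x):
        return 2.0 * x[0] + 0.5 * x[1] * x[2] + 0.01 * x[3]

    mu, mu_star, sigma = morris_screening(toy_model, k=4, r=10)
    print(mu_star, sigma)

Each trajectory costs k + 1 model runs, giving r(k + 1) runs in total. In this toy model the purely linear factor x[0] obtains a large µ* with σ near zero, while the interacting factors typically show a non-zero σ.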

A sensitivity analysis method widely used to screen factors in models of large dimensionality is the design proposed by Morris.[2] The Morris method deals efficiently with models containing hundreds of input factors without relying on strict assumptions about the model, such as additivity or monotonicity of the input-output relationship. The method is simple to understand and implement, and its results are easily interpreted. Furthermore, it is economical in the sense that it requires a number of model evaluations that is linear in the number of model factors. The method can be regarded as global, since the final measure is obtained by averaging a number of local measures (the elementary effects) computed at different points of the input space.[3]

See also

Notes

  1. ^ a b c d e Sensitivity Analysis in Practice 2004.
  2. ^ Morris 1991
  3. ^ Sensitivity analysis 2003[full citation needed]

References

  • Campolongo, F.; Tarantola, S.; Saltelli, A. (1999). "Tackling quantitatively large dimensionality problems". Computer Physics Communications. 117: 75–85. Bibcode:1999CoPhC.117...75C. doi:10.1016/S0010-4655(98)00165-9.
  • Morris, M.D. (1991). "Factorial Sampling Plans for Preliminary Computational Experiments" (PDF). Technometrics. 33: 161–174. doi:10.2307/1269043.
  • Campolongo, F.; Cariboni, J.; Saltelli, A. (2003). "Sensitivity analysis: the Morris method versus the variance based measures" (PDF). Submitted to Technometrics.
  • Saltelli, Andrea; Tarantola, Stefano; Campolongo, Francesca; Ratto, Marco (2004). Sensitivity Analysis in Practice: A Guide to Assessing Scientific Models. John Wiley & Sons, Ltd. pp. 94–120.