The Morris method, commonly used in sensitivity analysis, quantifies how strongly each input variable influences the output of a model. It is particularly useful when the model is complex and the relationship between inputs and outputs may be nonlinear or otherwise not straightforward. Developed by Max D. Morris in 1991, the method assesses how uncertainty in the input variables contributes to uncertainty in the model output. It does so by averaging "elementary effects": one-at-a-time finite-difference changes in the output, sampled along random trajectories through the input space, which makes it far cheaper than variance-based methods while still screening out unimportant inputs.
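Morris's procedure builds random trajectories on a grid over the unit hypercube and perturbs one input at a time by a fixed step, recording the resulting finite-difference "elementary effect" for each input. The mean of the absolute effects (often written mu*) ranks input importance, while their standard deviation flags nonlinearity or interactions. Below is a minimal sketch in Python with NumPy; the function name, parameters, and the toy model are illustrative, not from any particular library:

```python
import numpy as np

def morris_elementary_effects(model, k, r=10, levels=4, seed=0):
    """Estimate Morris elementary effects for model f: [0,1]^k -> R.

    Builds r one-at-a-time trajectories; each input is perturbed once
    per trajectory by a step delta, giving r elementary effects per input.
    Returns (mu_star, sigma): mean absolute effect and effect std dev.
    """
    rng = np.random.default_rng(seed)
    delta = levels / (2.0 * (levels - 1))        # standard Morris step size
    # Grid of start values chosen so that x + delta stays inside [0, 1].
    grid = np.arange(levels // 2) / (levels - 1)
    effects = np.zeros((r, k))
    for i in range(r):
        x = rng.choice(grid, size=k)             # random base point on the grid
        y = model(x)
        for j in rng.permutation(k):             # perturb inputs one at a time
            x_new = x.copy()
            x_new[j] = x[j] + delta
            y_new = model(x_new)
            effects[i, j] = (y_new - y) / delta  # elementary effect of input j
            x, y = x_new, y_new                  # continue along the trajectory
    mu_star = np.abs(effects).mean(axis=0)
    sigma = effects.std(axis=0, ddof=1)
    return mu_star, sigma

# Toy model: strong linear dependence on x0, weak nonlinear on x1, none on x2.
f = lambda x: 10 * x[0] + 0.1 * x[1] ** 2
mu_star, sigma = morris_elementary_effects(f, k=3)
print(mu_star)  # mu_star for x0 dominates; x2's is exactly zero
```

For the linear term the elementary effect is constant (so its sigma is near zero), while the quadratic term's effect varies with the base point, which is exactly the nonlinearity signal sigma is meant to capture.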
