A new validation metric is proposed that combines a threshold based on the uncertainty in the measurement data with a normalised relative error, and that is robust in the presence of large variations in the data. The outcome of the metric is the probability that a model's predictions are representative of the real world, given the specific conditions and confidence level pertaining to the experiment from which the measurements were acquired. Relative error metrics are traditionally designed for series of data values, but orthogonal decomposition has been employed to reduce the dimensionality of data matrices to feature vectors, so that the metric can also be applied to fields of data. Three previously published case studies, for which historical data were available, demonstrate the efficacy of this quantitative approach to validation in the discipline of structural analysis; however, the concept could be applied to a wide range of disciplines and sectors where modelling and simulation play a pivotal role.
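The abstract describes reducing full-field data matrices to feature vectors via orthogonal decomposition and then comparing measurement and prediction through a normalised relative error against an uncertainty-based threshold. The sketch below illustrates that general idea only; it is not the authors' published metric. It uses a singular value decomposition as the orthogonal decomposition (the paper may use a different basis, e.g. polynomial kernels), and the field data, the number of retained components `k`, and the uncertainty threshold `u_threshold` are all hypothetical.

```python
import numpy as np

def feature_vector(field, k=5):
    """Illustrative orthogonal decomposition: project a 2-D data field
    onto its singular-value spectrum and keep the leading k values as
    a compact feature vector. (Assumed stand-in for the decomposition
    actually used in the paper.)"""
    _, s, _ = np.linalg.svd(field, full_matrices=False)
    return s[:k]

def normalised_relative_error(measured, predicted, k=5):
    """Normalised relative error between the feature vectors of the
    measured and predicted fields."""
    fm = feature_vector(measured, k)
    fp = feature_vector(predicted, k)
    return np.linalg.norm(fp - fm) / np.linalg.norm(fm)

# Hypothetical example: a smooth 'measured' field and a model
# prediction that differs from it by small random perturbations.
rng = np.random.default_rng(0)
measured = np.outer(np.sin(np.linspace(0.0, np.pi, 50)),
                    np.cos(np.linspace(0.0, np.pi, 40)))
predicted = measured + 0.01 * rng.standard_normal(measured.shape)

err = normalised_relative_error(measured, predicted)
u_threshold = 0.05  # assumed relative measurement uncertainty
print(f"relative error = {err:.4f}; within threshold: {err <= u_threshold}")
```

Working in the reduced feature space rather than comparing fields point by point is what makes a series-oriented relative error applicable to data fields, and it is also what lends robustness to large local variations in the data.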
- relative error
- model validation
- computational modelling
- orthogonal decomposition
Patelli, E., & Patterson, E. A. (2019). [Comment/debate]. Royal Society Open Science, 6(12), 191986.
Dvurecenska, K., Graham, S., Patelli, E., & Patterson, E. A. (2018). A probabilistic metric for the validation of computational models. Royal Society Open Science, 5(11). https://doi.org/10.1098/rsos.180687