Do we have enough data? Robust reliability via uncertainty quantification

Roberto Rocchetta, Matteo Broggi, Edoardo Patelli*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

34 Citations (Scopus)
23 Downloads (Pure)

Abstract

A generalised probabilistic framework is proposed for reliability assessment and uncertainty quantification under a lack of data. The resulting computational tool quantifies the effect of epistemic uncertainty and has been applied to assess the reliability of an electronic circuit and a power transmission network. The strengths and weaknesses of the proposed approach are illustrated by comparison with traditional probabilistic approaches. In the presence of both aleatory and epistemic uncertainty, classical probabilistic approaches may lead to misleading conclusions and a false sense of confidence that does not fully reflect the quality of the available information. In contrast, generalised probabilistic approaches are versatile and, when linked to a computational tool, become applicable to realistic engineering problems.
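The core idea behind such generalised (imprecise) probabilistic approaches can be sketched with a double-loop Monte Carlo computation: an outer loop over epistemically uncertain parameters (known only as intervals) and an inner aleatory loop estimating the failure probability, yielding bounds rather than a single number. The limit state, interval, and sample sizes below are purely illustrative assumptions, not the paper's actual case studies.

```python
import random

random.seed(0)

def failure_prob(mu, sigma=1.0, capacity=3.0, n=20_000):
    """Aleatory (inner) loop: Monte Carlo estimate of P(demand > capacity)
    for a normally distributed demand with mean mu (hypothetical model)."""
    fails = sum(1 for _ in range(n) if random.gauss(mu, sigma) > capacity)
    return fails / n

# Epistemic (outer) loop: the demand mean is only known to lie in an
# interval, so we sweep it and keep the extreme failure probabilities.
mu_grid = [0.5 + 0.1 * k for k in range(11)]   # assumed mu in [0.5, 1.5]
probs = [failure_prob(mu) for mu in mu_grid]

p_lower, p_upper = min(probs), max(probs)
print(f"failure probability bounds: [{p_lower:.4f}, {p_upper:.4f}]")
```

The width of the resulting interval [p_lower, p_upper] directly reflects the quality of the available information: with more data the epistemic interval on mu shrinks and the bounds tighten, whereas a classical single-distribution analysis would report one number regardless and hide that imprecision.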

Original language: English
Pages (from-to): 710-721
Number of pages: 12
Journal: Applied Mathematical Modelling
Volume: 54
Early online date: 26 Oct 2017
DOIs
Publication status: Published - 1 Feb 2018

Keywords

  • computational tool
  • Dempster–Shafer
  • information quality
  • probability boxes
  • reliability
  • uncertainty quantification
