TY - JOUR
T1 - Insights into the quantification and reporting of model-related uncertainty across different disciplines
AU - Simmonds, Emily G.
AU - Adjei, Kwaku Peprah
AU - Andersen, Christoffer Wold
AU - Aspheim, Janne Cathrin Hetle
AU - Battistin, Claudia
AU - Bulso, Nicola
AU - Christensen, Hannah
AU - Cretois, Benjamin
AU - Cubero, Ryan
AU - Davidovich, Iván A.
AU - Dickel, Lisa
AU - Dunn, Benjamin
AU - Dunn-Sigouin, Etienne
AU - Dyrstad, Karin
AU - Einum, Sigurd
AU - Giglio, Donata
AU - Gjerløw, Haakon
AU - Godefroidt, Amélie
AU - González-Gil, Ricardo
AU - Cogno, Soledad Gonzalo
AU - Große, Fabian
AU - Halloran, Paul
AU - Jensen, Mari F.
AU - Kennedy, John James
AU - Langsæther, Peter Egge
AU - Laverick, Jack
AU - Lederberger, Debora
AU - Li, Camille
AU - Mandeville, Caitlin
AU - Mandeville, Elizabeth
AU - Moe, Espen
AU - Schröder, Tobias Navarro
AU - Nunan, David
AU - Parada, Jorge Sicacha
AU - Simpson, Melanie Rae
AU - Skarstein, Emma Sofie
AU - Spensberger, Clemens
AU - Stevens, Richard
AU - Subramanian, Aneesh
AU - Svendsen, Lea
AU - Theisen, Ole Magnus
AU - Watret, Connor
AU - O’Hara, Robert B.
PY - 2022/12/31
Y1 - 2022/12/31
N2 - Quantifying uncertainty associated with our models is the only way we can express how much we know about any phenomenon. Incomplete consideration of model-based uncertainties can lead to overstated conclusions with real-world impacts in diverse spheres, including conservation, epidemiology, climate science, and policy. Despite these potentially damaging consequences, we still know little about how different fields quantify and report uncertainty. We introduce the "sources of uncertainty" framework, using it to conduct a systematic audit of model-related uncertainty quantification from seven scientific fields, spanning the biological, physical, and political sciences. Our interdisciplinary audit shows no field fully considers all possible sources of uncertainty, but each has its own best practices alongside shared outstanding challenges. We make ten easy-to-implement recommendations to improve the consistency, completeness, and clarity of reporting on model-related uncertainty. These recommendations serve as a guide to best practices across scientific fields and expand our toolbox for high-quality research.
AB - Quantifying uncertainty associated with our models is the only way we can express how much we know about any phenomenon. Incomplete consideration of model-based uncertainties can lead to overstated conclusions with real-world impacts in diverse spheres, including conservation, epidemiology, climate science, and policy. Despite these potentially damaging consequences, we still know little about how different fields quantify and report uncertainty. We introduce the "sources of uncertainty" framework, using it to conduct a systematic audit of model-related uncertainty quantification from seven scientific fields, spanning the biological, physical, and political sciences. Our interdisciplinary audit shows no field fully considers all possible sources of uncertainty, but each has its own best practices alongside shared outstanding challenges. We make ten easy-to-implement recommendations to improve the consistency, completeness, and clarity of reporting on model-related uncertainty. These recommendations serve as a guide to best practices across scientific fields and expand our toolbox for high-quality research.
KW - uncertainty
KW - modelling
KW - statistical
KW - policy
KW - interdisciplinary
UR - https://www.cell.com/iscience/home
U2 - 10.1016/j.isci.2022.105512
DO - 10.1016/j.isci.2022.105512
M3 - Article
SN - 2589-0042
VL - 25
JO - iScience
JF - iScience
IS - 12
M1 - 105512
ER -