TY - JOUR
T1 - How do "technical" design choices made when building algorithmic decision-making tools for criminal justice authorities create constitutional dangers?
T2 - (Part I)
AU - Yeung, Karen
AU - Harkens, Adam
N1 - This is a pre-copyedited, author-produced version of an article accepted for publication in Public Law following peer review. The definitive published version (Yeung, K & Harkens, A 2023, 'How do "technical" design choices made when building algorithmic decision-making tools for criminal justice authorities create constitutional dangers? (Part I)', Public Law, vol. 2023, no. April, pp. 265-286) is available online on Westlaw UK.
PY - 2023/4/30
Y1 - 2023/4/30
N2 - This two-part paper argues that seemingly 'technical' choices made by developers of machine learning-based algorithmic tools used to inform decisions by criminal justice authorities can create serious constitutional dangers, enhancing the likelihood of abuse of decision-making power and the scope and magnitude of injustice. Drawing on three algorithmic tools in use, or recently used, to assess the 'risk' posed by individuals in order to inform how they should be treated by criminal justice authorities, we integrate insights from data science and public law scholarship to show how public law principles, and the more specific legal duties rooted in those principles, are routinely overlooked in algorithmic tool-building and implementation. We argue that technical developers must collaborate closely with public law experts to ensure that, if algorithmic decision-support tools are to inform criminal justice decisions, those tools are configured and implemented in a manner that is demonstrably compliant with public law principles and doctrine, including respect for human rights, throughout the tool-building process.
KW - algorithms
KW - risk assessment
KW - constitutional principles
KW - administrative law
KW - human rights
UR - https://www.sweetandmaxwell.co.uk/Product/Administrative-Law/Public-Law/Journal/30791427
M3 - Article
SN - 0033-3565
VL - 2023
SP - 265
EP - 286
JO - Public Law
JF - Public Law
IS - April
ER -