Enslaving the algorithm: from a 'right to an explanation' to a 'right to better decisions'?

Lilian Edwards, Michael Veale

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

As concerns about unfairness and discrimination in “black box” machine learning systems rise, a legal “right to an explanation” has emerged as a compellingly attractive approach for challenge and redress. We outline recent debates on the limited provisions in European data protection law, and introduce and analyse newer explanation rights in French administrative law and the draft modernised Council of Europe Convention 108. While individual rights can be useful, in privacy law they have historically unreasonably burdened the average data subject. “Meaningful information” about algorithmic logics is more technically possible than commonly thought, but this exacerbates a new “transparency fallacy”—an illusion of remedy rather than anything substantively helpful. While rights-based approaches deserve a firm place in the toolbox, other forms of governance, such as impact assessments, “soft law”, judicial review and model repositories deserve more attention, alongside catalysing agencies acting for users to control algorithmic system design.
Language: English
Pages: 46-54
Number of pages: 9
Journal: IEEE Security and Privacy Magazine
Volume: 16
Issue number: 3
Early online date: 25 Jun 2018
DOI: 10.1109/MSP.2018.2701152
Publication status: E-pub ahead of print - 25 Jun 2018

Keywords

  • algorithmic decision making
  • algorithmic logic
  • systems design
  • legal rights
  • data protection
  • privacy law

Cite this

@article{c315463d7e5140a981a7b68f3936b1d6,
title = "Enslaving the algorithm: from a 'right to an explanation' to a 'right to better decisions'?",
abstract = "As concerns about unfairness and discrimination in “black box” machine learning systems rise, a legal “right to an explanation” has emerged as a compellingly attractive approach for challenge and redress. We outline recent debates on the limited provisions in European data protection law, and introduce and analyse newer explanation rights in French administrative law and the draft modernised Council of Europe Convention 108. While individual rights can be useful, in privacy law they have historically unreasonably burdened the average data subject. “Meaningful information” about algorithmic logics is more technically possible than commonly thought, but this exacerbates a new “transparency fallacy”—an illusion of remedy rather than anything substantively helpful. While rights-based approaches deserve a firm place in the toolbox, other forms of governance, such as impact assessments, “soft law”, judicial review and model repositories deserve more attention, alongside catalysing agencies acting for users to control algorithmic system design.",
keywords = "algorithmic decision making, algorithmic logic, systems design, legal rights, data protection, privacy law",
author = "Lilian Edwards and Michael Veale",
year = "2018",
month = "6",
day = "25",
doi = "10.1109/MSP.2018.2701152",
language = "English",
volume = "16",
pages = "46--54",
journal = "IEEE Security and Privacy Magazine",
issn = "1540-7993",
publisher = "IEEE",
number = "3",

}
