Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling

Michael Veale, Lilian Edwards

Research output: Contribution to journal › Article

  • 1 Citation

Abstract

The Article 29 Data Protection Working Party's recent draft guidance on automated decision-making and profiling seeks to clarify European data protection (DP) law's little-used right to prevent automated decision-making, as well as the provisions around profiling more broadly, in the run-up to the General Data Protection Regulation. In this paper, we analyse these new guidelines in the context of recent scholarly debates and technological concerns. They foray into the less-trodden areas of bias and non-discrimination, the significance of advertising, the nature of “solely” automated decisions, impacts upon groups and the inference of special categories of data—at times, appearing more to be making or extending rules than to be interpreting them. At the same time, they provide only partial clarity – and perhaps even some extra confusion – around both the much discussed “right to an explanation” and the apparent prohibition on significant automated decisions concerning children. The Working Party appears to feel less mandated to adjudicate in these conflicts between the recitals and the enacting articles than to explore altogether new avenues. Nevertheless, the directions they choose to explore are particularly important ones for the future governance of machine learning and artificial intelligence in Europe and beyond.
Language: English
Number of pages: 7
Journal: Computer Law and Security Review
Volume: 34
Issue number: 2
Early online date: 10 Jan 2018
DOI: 10.1016/j.clsr.2017.12.002
State: Published - 30 Apr 2018

Keywords

  • automated decision making
  • algorithmic decision making
  • right to an explanation
  • right of access
  • data protection regulation

Cite this

@article{20cea36282c241acb83a3502939f4f80,
title = "Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling",
abstract = "The Article 29 Data Protection Working Party's recent draft guidance on automated decision-making and profiling seeks to clarify European data protection (DP) law's little-used right to prevent automated decision-making, as well as the provisions around profiling more broadly, in the run-up to the General Data Protection Regulation. In this paper, we analyse these new guidelines in the context of recent scholarly debates and technological concerns. They foray into the less-trodden areas of bias and non-discrimination, the significance of advertising, the nature of “solely” automated decisions, impacts upon groups and the inference of special categories of data—at times, appearing more to be making or extending rules than to be interpreting them. At the same time, they provide only partial clarity – and perhaps even some extra confusion – around both the much discussed “right to an explanation” and the apparent prohibition on significant automated decisions concerning children. The Working Party appears to feel less mandated to adjudicate in these conflicts between the recitals and the enacting articles than to explore altogether new avenues. Nevertheless, the directions they choose to explore are particularly important ones for the future governance of machine learning and artificial intelligence in Europe and beyond.",
keywords = "automated decision making, algorithmic decision making, right to an explanation, right of access, data protection regulation",
author = "Michael Veale and Lilian Edwards",
year = "2018",
month = "4",
day = "30",
doi = "10.1016/j.clsr.2017.12.002",
language = "English",
volume = "34",
journal = "Computer Law and Security Review",
issn = "0267-3649",
number = "2",

}

Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling. / Veale, Michael; Edwards, Lilian.

In: Computer Law and Security Review, Vol. 34, No. 2, 30.04.2018.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling

AU - Veale, Michael

AU - Edwards, Lilian

PY - 2018/4/30

Y1 - 2018/4/30

N2 - The Article 29 Data Protection Working Party's recent draft guidance on automated decision-making and profiling seeks to clarify European data protection (DP) law's little-used right to prevent automated decision-making, as well as the provisions around profiling more broadly, in the run-up to the General Data Protection Regulation. In this paper, we analyse these new guidelines in the context of recent scholarly debates and technological concerns. They foray into the less-trodden areas of bias and non-discrimination, the significance of advertising, the nature of “solely” automated decisions, impacts upon groups and the inference of special categories of data—at times, appearing more to be making or extending rules than to be interpreting them. At the same time, they provide only partial clarity – and perhaps even some extra confusion – around both the much discussed “right to an explanation” and the apparent prohibition on significant automated decisions concerning children. The Working Party appears to feel less mandated to adjudicate in these conflicts between the recitals and the enacting articles than to explore altogether new avenues. Nevertheless, the directions they choose to explore are particularly important ones for the future governance of machine learning and artificial intelligence in Europe and beyond.

AB - The Article 29 Data Protection Working Party's recent draft guidance on automated decision-making and profiling seeks to clarify European data protection (DP) law's little-used right to prevent automated decision-making, as well as the provisions around profiling more broadly, in the run-up to the General Data Protection Regulation. In this paper, we analyse these new guidelines in the context of recent scholarly debates and technological concerns. They foray into the less-trodden areas of bias and non-discrimination, the significance of advertising, the nature of “solely” automated decisions, impacts upon groups and the inference of special categories of data—at times, appearing more to be making or extending rules than to be interpreting them. At the same time, they provide only partial clarity – and perhaps even some extra confusion – around both the much discussed “right to an explanation” and the apparent prohibition on significant automated decisions concerning children. The Working Party appears to feel less mandated to adjudicate in these conflicts between the recitals and the enacting articles than to explore altogether new avenues. Nevertheless, the directions they choose to explore are particularly important ones for the future governance of machine learning and artificial intelligence in Europe and beyond.

KW - automated decision making

KW - algorithmic decision making

KW - right to an explanation

KW - right of access

KW - data protection regulation

UR - http://www.sciencedirect.com/science/article/pii/S026736491730376X

U2 - 10.1016/j.clsr.2017.12.002

DO - 10.1016/j.clsr.2017.12.002

M3 - Article

VL - 34

JO - Computer Law and Security Review

T2 - Computer Law and Security Review

JF - Computer Law and Security Review

SN - 0267-3649

IS - 2

ER -