Automating identification of potentially problematic privacy policies

Research output: Contribution to journal › Article › peer-review



Almost every website, mobile application or cloud service requires users to agree to a privacy policy, or similar terms of service, detailing how the developer or service provider will handle user data and the purposes for which it will be used. Many past works have criticised these documents for their length, their excessively complex wording, the fact that users typically do not read or understand them, and the potentially invasive or wide-reaching terms they contain. In this paper, an automated approach and tool for gathering and analysing these policies is presented, and some important considerations for these documents are highlighted, specifically those surrounding past legal rulings on the enforceability of two specific and widely-used contract terms: the ability for terms to be changed without directly notifying users (with presumed continued use indicating acceptance), and the protections in place in the event of a sale or acquisition of the company. The concerns these terms pose to user privacy and choice are discussed, along with the extent to which they appear in policies and documents from many popular websites. The tool was used to measure how commonly these terms occur, illustrating the scale of the potential problem, and to explore possible solutions to the challenge of regulating user privacy via such contracts in an era where mobile devices hold significant quantities of highly sensitive personal data, which is highly desirable to service operators as a core valuation asset of their companies.
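The abstract does not describe the tool's detection method, but the kind of automated analysis it outlines could be approximated with simple pattern matching over policy text. The following is a minimal, hypothetical sketch: the `CLAUSE_PATTERNS` dictionary, its labels, and the regular expressions are illustrative assumptions, not the paper's actual heuristics.

```python
import re

# Hypothetical patterns for the two clause types discussed above:
# unilateral changes to terms, and data transfer on sale/acquisition.
# These are illustrative only; the paper's actual method is not given here.
CLAUSE_PATTERNS = {
    "unilateral_change": [
        r"we (?:may|reserve the right to) (?:change|modify|update) (?:these|this) (?:terms|policy)",
        r"continued use .{0,40}constitutes (?:your )?acceptance",
    ],
    "business_transfer": [
        r"(?:merger|acquisition|sale of (?:all or part of )?(?:our|the) (?:assets|company|business))",
        r"transfer(?:red)? .{0,40}(?:successor|acquirer|new owner)",
    ],
}

def flag_clauses(policy_text: str) -> dict:
    """Return which potentially problematic clause types appear in the text."""
    # Lower-case and collapse whitespace so patterns match across line breaks.
    text = " ".join(policy_text.lower().split())
    return {
        label: any(re.search(pattern, text) for pattern in patterns)
        for label, patterns in CLAUSE_PATTERNS.items()
    }

sample = (
    "We reserve the right to modify these terms at any time; your "
    "continued use of the service constitutes acceptance of the changes. "
    "In the event of a merger or sale of our assets, user data may be "
    "transferred to a successor entity."
)
print(flag_clauses(sample))
# → {'unilateral_change': True, 'business_transfer': True}
```

A production tool would need far more robust techniques (crawling, HTML extraction, and likely NLP rather than fixed regexes), but this illustrates the basic idea of flagging policies that contain the two clause categories the paper highlights.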
Original language: English
Journal: Nordic and Baltic Journal of Information and Communications Technologies
Publication status: Accepted/In press - 22 Feb 2016


  • automating identification
  • cloud services
  • cloud computing
  • privacy policies
  • personal data


