New article in Big Data & Society (open access): “Predictive Privacy: Collective Data Protection in the Context of AI and Big Data”

I am excited to announce the release of my latest article on predictive privacy. In this paper, I tackle the challenge posed by big data and artificial intelligence, which allow predictions to be made about individuals based on anonymous data from many other people. Going beyond my 2021 article in Ethics and Information Technology, this paper elaborates in greater depth on the philosophical and legal implications of predictive privacy. It presents a refined definition of the term and argues for establishing predictive privacy as a new protected good.

Predictive analytics has significant potential for abuse, leading to social inequality, discrimination, and exclusion. Current data protection laws do not adequately address these risks, leaving the use of anonymized mass data largely unregulated. My paper introduces the concept of ‘predictive privacy’ as a data protection approach to counter these risks. A person’s predictive privacy is violated when personal information about them is predicted without their knowledge or against their will, based on the data of many other people. I argue that predictive privacy should be recognized as a protected good and propose improvements to data protection regulations. Finally, I highlight the need to regulate ‘prediction power’, a new manifestation of informational power asymmetry between platform companies and society. If you want to stay ahead of the curve in privacy research and contribute to shaping the future of data protection, you can access the full article via the links below.

Read more about predictive privacy

Download options and bibliographic data

  1. Mühlhoff, Rainer. 2023. “Predictive Privacy: Collective Data Protection in the Context of AI and Big Data”. Big Data & Society, 1–14. doi:10.1177/20539517231166886.
