Pricing privacy – the right to know the value of your personal data
Introduction: from passive defence to active empowerment
The commodification of digital identities is an emerging reality in the data-driven economy.1 Individuals' personal data represent monetary value and are often considered as counter-performance for "free" digital services.
De facto monetisation of personal data already at stake
The monetisation of personal data is already a reality in nearly all fields of the digital market. The European Commission has highlighted that the market for consumers' data is growing fast and that business models based on monetising data are becoming predominant.20
Savings
Consumers are encouraged to disclose their personal data through a discount covering part or all of the price.
Earnings
Consumers are encouraged to disclose their personal data via a monetary benefit (e.g. a digital wallet).27
A counter-service
In particular personalisation: consumers are encouraged to disclose their personal data in exchange for a more tailored service, e.g., a personalised search engine or a personalised social network platform. In some cases, the online service offered may lose functionality when it cannot be personalised.
No incentives
None of the above incentives applies. In these cases, consumers often face an all-or-nothing choice when disclosing their personal data.
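The four transaction structures above can be sketched as a simple classification. This is a minimal illustration only; the class and label names are assumptions for exposition, not terminology from the paper:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Incentive(Enum):
    """Illustrative labels for the four transaction structures described above."""
    SAVINGS = auto()          # discount covering part or all of the price
    EARNINGS = auto()         # direct monetary benefit, e.g. a digital wallet
    COUNTER_SERVICE = auto()  # tailored/personalised service in exchange for data
    NONE = auto()             # no incentive: an all-or-nothing disclosure choice

@dataclass
class DataTransaction:
    service: str
    incentive: Incentive

    def is_explicitly_monetised(self) -> bool:
        # Only savings and earnings attach an explicit monetary figure to the
        # disclosed data; counter-services monetise it only implicitly.
        return self.incentive in (Incentive.SAVINGS, Incentive.EARNINGS)

tx = DataTransaction("supermarket loyalty card", Incentive.SAVINGS)
print(tx.is_explicitly_monetised())  # True
```

The point of the distinction is that only the first two structures put a visible price on the data, which is precisely the transparency a right to know the value of one's personal data would extend to the other two.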
Combining this transaction structure classification with the
Quantifying the value of personal data
When asking how much a person's data is worth, the answer is: not much. General information about a person, such as age, gender, and location, is worth a mere 0.05 cents. People who are shopping for a car, a financial product, or a vacation are more valuable to companies that want to pitch those goods. For instance, the personal data of car buyers are worth about 0.21 cents per person.34 Personal data of people
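Taken at face value, the per-person prices above imply strikingly small aggregate values. The following back-of-the-envelope sketch is illustrative only; the segment names and the conversion of the quoted figures to US dollars (0.05 cents = $0.0005, 0.21 cents = $0.0021) are assumptions:

```python
# Illustrative per-person prices in USD, converted from the cents figures above.
PRICE_PER_PERSON = {
    "general_info": 0.0005,  # age, gender, location
    "car_buyer": 0.0021,     # person currently shopping for a car
}

def audience_value(segment: str, audience_size: int) -> float:
    """Rough total value (USD) of a data set covering `audience_size` people."""
    return PRICE_PER_PERSON[segment] * audience_size

# A data set of one million general profiles is worth only about $500,
# while the same-sized list of in-market car buyers fetches roughly $2,100.
print(round(audience_value("general_info", 1_000_000), 2))  # 500.0
print(round(audience_value("car_buyer", 1_000_000), 2))     # 2100.0
```

The arithmetic underscores why individual records are nearly worthless in isolation and only acquire economic significance in aggregate.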
“Active choice” models and the GDPR
There are several ways of increasing consumers' awareness of the monetisation of personal data in the modern information society. It may be suggested that there are better alternatives to a right to know the value of one's personal data. Particularly, so-called "active choice" models are often mentioned in this respect.64 These models refer to an
Problems of pricing privacy
In this section, we discuss some problems with the idea of introducing a right to know the value of one's own personal data in EU data protection law. Section 5.1 discusses practical problems, Section 5.2 broader moral problems, and Section 5.3 cognitive problems.
Conclusions
In this paper, we analysed whether consumers/users should have a right to know the value of their personal data. The main reason to consider this question is that, on the one hand, the commodification of digital identities is an emerging reality in the data-driven economy, while, on the other hand, individuals do not seem to be fully aware of the monetary value of their personal data. They tend to underestimate their economic power within the data-driven economy and to passively succumb to the
Author information
Gianclaudio Malgieri, LLM, is a PhD Researcher at the Law, Science, Technology and Society (LSTS) research group of Vrije Universiteit Brussel, Belgium. Email: [email protected]. Bart Custers, PhD MSc LLM, is associate professor and head of research at eLaw, the Center for Law and Digital Technologies, at the Faculty of Law of Leiden University, the Netherlands.