“Whoever controls the media, controls the mind”
There seems to be close agreement that in the New Digital World, Data is the new Gold (also referred to as the new Oil, or the new Black Gold), and Data Mining (Analytics) the new Gold Rush. In the future, the capability to refine huge quantities of data (the analysis capabilities and algorithms able to turn raw data into meaningful and actionable information) will be a key differentiator:
The amount of data in our world has been exploding, and analyzing large data sets—so-called big data—will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus (McKinsey Global Institute: Big data: The next frontier for innovation, competition, and productivity)
Small companies, large companies, entrepreneurs, investors, executives, professors, politicians, even journalists: everybody seems to have a good reason to flock to this new big data trend. Large services companies in particular hold large reserves of customer data, most of which remains untapped today. Mobile Network Operators, for example, sit on plenty of mobile data, an opportunity to build products and services that business and consumer customers will value, driving a new wave of growth.
In sharp contrast to what happened during the gold rush or in the oil business, tapping into the data flow seems within easy reach. You need not travel miles away in search of a new Yukon River, or be the fortunate owner of a piece of land sitting on top of an oil pool, to exploit data. You only need an internet connection and a bit of curiosity. Data is everywhere in the Digital World, like oxygen in the air. So you can go out, start drilling and refining data, and sell your data-based product. There are plenty of possibilities and, clearly, the sky is the limit. You can just start, can’t you?
Well, perhaps you should first take into account a minor detail: the customers themselves. Who owns their data? How may it be used? What laws and jurisdictions rule over it? For all the simplicity that accessing, gathering and processing data represents today, we must be aware that we are dealing with very sensitive material: personal data. If you are your data in this New Digital World, you are lending yourself, or at least a part of you, to every service provider you decide to engage with. I would not like to qualify or compare this kind of relationship, but we all feel it is not the same kind of relationship you have with the grocery store where you buy your fruit. If you are your data, you must also be aware that you are protected by the same kind of laws and technologies that define and enforce Intellectual Property (Pamela Samuelson, “Privacy as Intellectual Property?”). They are all designed to strike a very delicate trade-off between data protection and data exploitation, and they are both technically and legally fragile.
Many people consider that disclosure of personal data is increasingly a part of modern life. In Europe, 74% of Europeans see disclosing personal information as necessary to access online services, particularly online shopping (79%) and social networking and sharing sites (61%). However, 72% of internet users in Europe still worry that they are being asked for too much personal data online (Eurobarometer: “Attitudes on Data Protection and Electronic Identity in the European Union”). A recent study on Search Engine Use by Pew Research shows that even though online Americans are more satisfied than ever with the performance of search engines, strong majorities have negative views of personalized search results and targeted ads.
Users feel they are not in control of their data. Therefore, putting the customer in control is a key priority for regulators (New European Regulation on Data Protection and Obama blueprint for a Consumer Privacy Bill of Rights) and increasingly, if controversially, for companies who plan to live on customer data.
Control Mantra: Giving users control and transparency over their personal data will increase their trust in services and service providers. Trust will motivate users to accept new ways of data sharing while reducing churn and reputational risks.
The study reveals that increasing awareness does not by itself represent increased value to users, nor does it necessarily result in higher confidence in the service. Quite the contrary: awareness can increase the perception of risk when users can’t answer basic questions raised by what they see, and it lowers trust when they discover unexpected uses of their information. In addition, control can increase the perception of work. When using a service, users look for maximum benefit, and they don’t want to spend time and effort managing data related to the service. For the average user, managing privacy data is particularly challenging. Even when presented with answers, most users lack the basic technical knowledge to interpret them.
Users perceive extra value and higher trust only if they clearly understand the causes, consequences and compensation of allowing the use of their data. Users are more willing to share data when they expect to receive free or discounted services, faster delivery of a service, increased social reputation, or opportunities to socialize and connect. On the other hand, users are more reluctant to share data whenever they perceive a risk to their personal or group intimacy, economic welfare, physical safety or social reputation. Cause and effect must be clear to users so they can recognize the value of personalized content. Any risk users might perceive can be overcome by a perception of high value: should users think the benefit is worth the chance, they may ignore the perceived risk, or even willingly adopt behaviours that are perceived as risky.
There is an additional finding which poses what I consider a particularly relevant challenge, especially in the light of the new European Data Protection Regulation put forward by the Commission. One of the Commission’s proposals obliges companies and data-controlling organizations to notify data breaches without undue delay (where feasible, within 24 hours) to both the data protection authorities and the individuals concerned. This sounds sensible, but what users want is to feel and know that their data is safe when using digital services. And whenever unexpected consequences arise, users should be able not only to understand them but also to know what to do to correct them, whatever their skill level. All in all, it is probably fair to say that it is all about convenience: all about the old customer value proposition.
Other studies present an even gloomier perspective on the real possibility of granting users effective control over their information as a way to create a more balanced and productive online economy. Asymmetric information, bounded rationality, and cognitive and behavioural biases plague the field of information privacy and make it very difficult to analyse and design general information privacy policies. The experimental literature suggests that individual responsibility is not sufficient, and that the notice-and-choice approach is inadequate for privacy protection. Alessandro Acquisti and coworkers at Carnegie Mellon University speak about the control paradox:
Control paradox: Individuals who perceive more control over the release of private information are likely to pay less attention to the accessibility, and consequent usage of the information by others – leading in some cases to increased dissemination of potentially harmful information. Paradoxically, therefore, through a mechanism that runs from perceived control to individual behaviour to consequences, technologies that make individuals feel more in control over the release of personal information can increase their likelihood of suffering harms as a result of disclosures (L. Brandimarte, A. Acquisti, G. Loewenstein “Misplaced Confidences: Privacy and the Control Paradox”)
The control paradox opens a can of worms, because it might be used by paternalistic governments to justify the design and enforcement of policies that automatically prevent people from making decisions or taking a specific course of action in the name of their own security (The Economist, “The Avuncular State”).
The zeitgeist is haunting the internet, blowing in a ravenous feeling of urgency: Carpe diem! Forget about the past and the future. We may forget them, but our past, present and future are bound by our data trails. In his book “Delete”, Viktor Mayer-Schönberger explains how in the 1930s the Dutch government put in place a population registry which contained names, birth dates, addresses, religions, and so on. The registry was hailed as facilitating government administration and improving welfare planning. When the Nazis invaded the Netherlands, they took possession of the registry. Thanks to the information it contained, the Nazis were able to identify, persecute and murder a much higher percentage of Jews and Gypsies than in other European countries. The Dutch provided the information trusting their government, in the same way that we today provide information to our Digital Service Providers, trusting them. The Dutch, however, did not foresee the looming threat of the Nazi invasion, nor what their data in the government registry would come to mean.
I am today an eager user of the internet and all its beautiful and useful, if sometimes wild, services. I walk my way and lend my data generously in return for its sometimes incredible and sometimes stupid services. I experiment, and I take my risks. Doing so is my job today, and it is also part of my life. But I keep very much in mind the words of Cardinal Richelieu:
Give me six lines written by the hand of the most honest man, and I would find something in them to have him hanged.