“Congressman, in general we collect data of people who have not signed up for Facebook”
— Mark Zuckerberg, testifying before the US House of Representatives, April 11th, 2018.
Ben Ray Luján (D-NM) used the term “shadow profile” during this exchange to describe a controversial business practice: the collection and use of personal data without consent. Many internet users have long understood that, in exchange for free and valuable online services, tech companies will use their personal information for profit. For many companies, however, the consent required to use data as they see fit is a barrier to getting things done, a situation that encourages consent to be rushed or skipped entirely. Crucially, data breaches can lead to real harm – re-identification, identity theft, and social engineering attacks are genuine threats to personal security. Personal information is a valuable resource, and as technical knowledge diffuses into legislatures, lawmakers are working to strike a fair balance between consumer protections and the potential of big data.
General Data Protection Regulation
Increasing concern for personal data has led to increasingly restrictive privacy laws in many countries. In May 2018, the European Union will begin enforcing its General Data Protection Regulation (GDPR), a set of laws that many believe will set the tone for consumer data protections worldwide. Alongside new guidelines concerning consent, usage, and data breach notification, the GDPR introduces a set of privacy rights for individuals in the EU. Headlining this group is “The Right to Be Forgotten”, which is accompanied by rights to object to certain forms of processing, to challenge decisions made about you by algorithm, and to receive a copy of all the personal data an organization holds on you. Compliance is mandatory; companies found to have violated these rights face large penalties: up to €20 million or 4% of global revenue, whichever is higher.
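The “whichever is higher” rule means the €20 million figure acts as a floor rather than a cap for smaller firms. A minimal sketch (the function name and revenue figures are illustrative, not from any official source):

```python
def gdpr_fine_cap(global_revenue_eur: float) -> float:
    """Upper bound on a GDPR administrative fine for the most serious
    infringements: EUR 20 million or 4% of worldwide annual revenue,
    whichever is higher."""
    return max(20_000_000.0, 0.04 * global_revenue_eur)

# A firm with EUR 100M in revenue: 4% is only EUR 4M, so the EUR 20M floor applies.
print(gdpr_fine_cap(100_000_000))    # 20000000.0
# A firm with EUR 1B in revenue: 4% (EUR 40M) exceeds the floor.
print(gdpr_fine_cap(1_000_000_000))  # 40000000.0
```

The asymmetry is deliberate: percentage-only penalties would be trivial for small violators, while a flat fine alone would be a rounding error for the largest multinationals.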
For multinational corporations, compliance became a priority in a hurry. The EU is a huge market, and many industries use personal data, from the software superpower to the lowly importer/exporter. Unsurprisingly, IT governance spending has spiked in the years since the GDPR was first proposed in 2012. Cataloguing the data an organization stores can be costly and difficult, depending on how large the data footprint is and how many lawyers are involved. Many believe the effort is worth it: customer perceptions of due diligence around privacy are quickly becoming part of a brand’s value, a belief reflected in Facebook’s reactive offering of personalized data profiles to users following the Cambridge Analytica scandal (interestingly, non-users with ‘shadow profiles’ would be required to create a Facebook account to receive theirs).
Are these laws necessary? Are they valuable? As a privacy professional profiting from this culture shift, I’m inclined to say ‘Yes’. Mortgage-driven bias aside, such questions inspire me to share the story of Andrew Pole to illustrate the potential for harm (Charles Duhigg of the New York Times tells this story better). Briefly, Mr. Pole worked in a data science shop within Target in 2010, where his team built a predictive analytics engine that inferred events in the lives of customers from their purchasing habits. Metrics like ‘pregnancy score’ estimated how likely such an event was to occur, and Target’s marketing department used this information to send along custom coupons designed to disrupt habitual purchasing patterns in a profitable way.
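The mechanics of such a score are simple to sketch. Duhigg reports that Target identified a couple dozen products whose purchase predicted pregnancy; the specific items, weights, and threshold below are invented for illustration and are not Target’s actual model:

```python
import math

# Hypothetical purchase signals and log-odds weights; illustrative only.
WEIGHTS = {
    "unscented_lotion": 1.2,
    "calcium_supplement": 0.9,
    "large_cotton_balls": 0.7,
    "scent_free_soap": 0.8,
}
BIAS = -2.5  # baseline log-odds for a random customer; made up

def pregnancy_score(basket: set[str]) -> float:
    """Toy propensity score: a logistic function over weighted
    purchase indicators. Score near 1.0 means 'likely pregnant'."""
    z = BIAS + sum(w for item, w in WEIGHTS.items() if item in basket)
    return 1 / (1 + math.exp(-z))

basket = {"unscented_lotion", "calcium_supplement", "large_cotton_balls"}
print(round(pregnancy_score(basket), 2))  # 0.57
```

Each seemingly innocuous purchase nudges the log-odds upward; no single item is revealing, but the combination is, which is precisely the privacy problem the next section describes.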
It was quickly discovered that this approach made customers uncomfortable. In an interview, Mr. Pole recounted how an angry father complained to the US retailer about pregnancy-themed coupons addressed to his daughter, shouting, “Are you encouraging my teenaged daughter to get pregnant?”. Weeks later, the same manager who had soothed the father with apologies and assurances received an apology of his own; the father had since learned from his daughter that she was, in fact, pregnant.
While the personalized coupon was a novelty in 2010, this marketing technique is commonplace today, and has led to greater public awareness of the value of information privacy. Seemingly unimportant bits of data can be combined to identify someone – or something about someone – leading people to make inferences, draw conclusions, and then act in ways they otherwise wouldn’t have. Under new EU law, the young woman would have had to opt in, that is, give active consent, before being included in such schemes. Protections offered by EU decree might have prevented the unexpected and uncomfortable conversation with her father, or worse.
Some have expressed concerns that these new restrictions will have a chilling effect on entrepreneurship. While measures to comply with the new laws will draw funding away from other priorities, small companies have the advantage of a small data footprint relative to that of large multinationals, affording the opportunity to incorporate privacy into their operations from the outset. It could even be an opportunity to set oneself apart from established players; privacy remains important to consumers, and industry polling suggests that only a fraction of companies will be compliant with GDPR when it becomes enforceable on May 25th.
Regardless, it is likely to remain an uphill battle for startups in the age of privacy. Market research suggests that buyers are more likely to trust their personal data to well-known companies. This perception could stem from the belief that bigger brands have the resources to implement the necessary controls. It might also stem from a belief in social liability, as social media campaigns have shamed more than a few giants into heavily publicized and costly reactions intended to restore reputation. Would-be entrepreneurs can overcome these challenges, but trustworthiness must be signaled to customers to avoid sharing a stratum of credibility with those two guys selling stereo equipment from the back of a van.
The Future of Privacy
For lawmakers in the EU, some uncertainty remains around enforcement beyond the EU’s shores. Countries with lax laws and little incentive to limit local processing of EU data must be coaxed into cooperation. Moreover, the rise of blockchain presents problems: the technology replicates a complete, append-only copy of the data set across many sites, frustrating anyone who would amend or erase the records it contains. In contrast, opportunities exist in the widespread adoption of privacy controls. Healthcare in particular stands to benefit, as de-identification services have emerged to uphold doctor-patient confidentiality norms dating back to antiquity, allowing analysts to draw insights from rich hospital databases that were, until now, restricted.
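At its simplest, de-identification means stripping direct identifiers and replacing linking keys with pseudonyms. A minimal sketch, assuming a keyed hash as the pseudonymization step (the field names and key are hypothetical):

```python
import hashlib
import hmac

SECRET_KEY = b"replace-with-managed-secret"  # hypothetical key, kept separate from the data

DIRECT_IDENTIFIERS = {"name", "address", "phone"}

def deidentify(record: dict) -> dict:
    """Minimal sketch: drop direct identifiers and replace the patient ID
    with a keyed pseudonym, so records can still be linked across tables
    without exposing the original identifier."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    out["patient_id"] = hmac.new(
        SECRET_KEY, str(record["patient_id"]).encode(), hashlib.sha256
    ).hexdigest()[:16]
    return out

row = {"patient_id": 1042, "name": "A. Example", "address": "...",
       "diagnosis": "J45", "age": 34}
print(deidentify(row))
```

Note that this is pseudonymization, the weakest form of de-identification: quasi-identifiers such as age and diagnosis remain, and combining them is exactly the re-identification risk discussed earlier. Production services layer on techniques like generalization and suppression.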
Privacy is an important value within democratic societies, now more than ever. Having something to hide or not is immaterial. Privacy affords us a freedom of choice – the choice of how others will use your personal information. The GDPR enshrines this value in law by returning ownership of personal data to the individual, who may then allow others to hold it in trust. Companies hoping to do business with the EU will need to embrace this idea, as they will be obliged to make good on that trust.