European Union’s General Data Protection Regulation and Lessons for U.S. Privacy Policy

GDPR Threatens Innovation in America and Around the World


The European Union’s (EU) General Data Protection Regulation (GDPR), which takes effect on May 25, 2018, is the most significant policy change regarding data collection and retention in history, with implications far beyond the EU. It could significantly affect any business or organization, in the United States or around the world, that interacts with EU residents. The GDPR will affect not only Internet companies, but also businesses that collect or process personal data in the offline world. Worse, it will significantly harm competition and innovation not only in Europe, but around the world.

This will result in greater market concentration, as small firms and startups will find it difficult to comply with the increased regulatory cost burden. When considering new privacy legislation, American lawmakers should consider the GDPR’s many flaws, and strive to avoid them.

The GDPR, which significantly updates the EU’s 1995 Data Protection Directive, aims to cover all aspects of “personal data” by expanding the term’s definition to encompass “any information relating to an identified or identifiable natural person.” Though simpler than its predecessor, this definition significantly broadens the scope of data considered personal, encompassing information such as IP addresses and location data.

The GDPR seeks to respond to the rise of “big data,” a concept that refers to the data sets of high volume, variety, and velocity that have helped fuel technological innovation in the 21st century. Technologies powered by big data, such as artificial intelligence (AI), have sparked concerns regarding privacy, control of user data, and potential monopolization of online commerce. While the GDPR seeks to address some understandable concerns, the regulation is so broad in scope—clocking in at 99 articles over 261 pages—that its various provisions conflict with one another in numerous ways. That is a recipe for unsound policy and regulatory confusion.

In the United States, lawmakers from both major parties, most prominently Sens. John Kennedy (R-La.) and Amy Klobuchar (D-Minn.), have expressed interest in the GDPR in the wake of the alleged misappropriation of Facebook user data by the recently defunct political consulting firm Cambridge Analytica, which was based in Great Britain. However, before rushing to adopt GDPR-style regulation in the United States, lawmakers should consider the unintended consequences of such an all-encompassing regulatory regime. This paper discusses three significant implications of the GDPR:

  1. Economic impact, including compliance costs;
  2. Establishment of new “digital rights” with mutually conflicting goals; and
  3. Obstruction of innovation.

GDPR’s Economic Impact. The GDPR affects the vast majority of businesses that operate within the European Union, regardless of where they are located. If a business processes the personal data of EU residents, it is subject to the GDPR, even if the data is processed outside the European Union. The GDPR introduces numerous new supposed “rights,” including the:

  • Right of portability, which requires companies to export user data on request in a commonly used machine-readable format; and 
  • Right of erasure, which requires companies to delete a person’s data at his or her request.  

These mandates have led companies around the world, including U.S. firms, to make major changes in how they handle data.
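
To make these mandates concrete, the following minimal sketch (with hypothetical record fields and function names, not any company’s actual API) shows what a machine-readable export and an erasure routine might look like:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class UserRecord:
    """Hypothetical profile a data controller might hold on a user."""
    user_id: str
    email: str
    ip_addresses: list       # under the GDPR, IP addresses are personal data
    location_history: list   # so is location data

def export_user_data(record: UserRecord) -> str:
    """Right of portability: return the user's data in a commonly
    used, machine-readable format (here, JSON)."""
    return json.dumps(asdict(record), indent=2)

def erase_user_data(store: dict, user_id: str) -> None:
    """Right of erasure: delete a person's data at his or her request."""
    store.pop(user_id, None)

# Illustrative usage
store = {"u1": UserRecord("u1", "a@example.com", ["203.0.113.5"], ["Paris"])}
print(export_user_data(store["u1"]))  # portable JSON copy for the user
erase_user_data(store, "u1")          # record removed on request
```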

While the GDPR was unveiled in 2016, giving businesses two years to reach full compliance, many businesses remain unprepared just weeks before the regulation goes into effect. In the United Kingdom, for example, an estimated 70 percent of companies will be out of compliance, and thus exposed to potential fines, as soon as the GDPR enters into force.

The new data protection framework contains so many new compliance criteria—to be implemented at once—that it prompted the wholesale redesign of many business practices. For example, advertising firms may no longer be able to rely on individualized targeting under the GDPR, which permits such tailoring only if users opt in.

As with many top-down regulations, the GDPR will lead to compliance costs that restrict competition by unduly burdening small and mid-sized businesses across a variety of industries. For example, even email marketing, which many small businesses use to try to reach potential customers, can subject a business to GDPR rules.

When crafting the GDPR, EU regulators apparently did not take into account the particulars of each industry. For example, the automobile industry faces challenges due to the GDPR’s erasure mandate, because a vehicle equipped with the latest technology will typically leave data on far-flung networks as it travels across cities or even countries.

In response to the GDPR’s burdensome requirements, a new compliance industry in data auditing has arisen to help companies navigate the regulatory morass. The International Association of Privacy Professionals (IAPP) and EY (formerly Ernst & Young) estimate that Fortune 500 companies spent an average of $16 million each to comply with the GDPR in the two years leading up to its effective date, while mid-sized firms spent an average of $550,000. Despite these significant outlays, the cost of non-compliance is potentially greater still, with fines of up to €20 million ($23.86 million) or 4 percent of a company’s global revenue, whichever is greater.
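
In concrete terms, the “whichever is greater” rule means the flat €20 million floor binds smaller firms while the 4 percent prong binds the largest ones. A short arithmetic sketch, with purely illustrative revenue figures:

```python
def max_gdpr_fine(global_revenue_eur: float) -> float:
    """Upper bound on a GDPR fine: EUR 20 million or 4 percent of
    global annual revenue, whichever is greater."""
    return max(20_000_000, 0.04 * global_revenue_eur)

# A firm with EUR 10 billion in global revenue:
print(max_gdpr_fine(10_000_000_000))  # 400000000.0 -> the 4 percent prong governs
# A firm with EUR 100 million in global revenue:
print(max_gdpr_fine(100_000_000))     # 20000000 -> the flat EUR 20m floor governs
```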

Most of the GDPR’s compliance costs fall on high-tech and financial services firms, which almost invariably process and control user data and are thus required to hire a data protection officer—a position that many companies have failed to fill to date due to a shortage of qualified applicants. Non-tech firms that process or collect personal data are experiencing their own compliance challenges—including determining the extent to which the GDPR applies to them and what they will need to spend to comply with it.

In addition, changing the terms and conditions governing data collection and use will endanger many popular and beneficial business models. The GDPR requires firms to: 

  • Use simple language; 
  • Describe how they use personal data; 
  • Ensure that users opt in to collection; and
  • Obtain user consent before permitting any third parties to use their data.

While these requirements may seem reasonable on their face, they are likely to result in a worse online experience for most users. By forcing advertising-supported websites to obtain express consent, the GDPR will inundate EU residents with pop-ups and similar warnings that users often fail to understand. As a result, users will see fewer ads that are relevant to their individual interests, while online services will suffer from reduced ad revenue on a per-user basis.

Under the GDPR, by default, a user must opt in to the collection and use of his or her information by Internet advertising networks, among other companies. While some users may prefer not to receive individualized ads, most users tend to stick with whichever default option is presented to them. In the advertising context, when presented with the option of opting in to persistent tracking—which allows advertisers to tailor ads to users as they browse online—the majority of users are reluctant to grant their consent. This reluctance is at odds with users’ general preference for ad-subsidized services and tailored advertisements.
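
A back-of-the-envelope sketch shows why the default matters so much. The 10 percent switch rate below is an assumption for illustration only, not an empirical figure:

```python
# Hypothetical: the same 1,000-user base under two default regimes.
# Most users keep whatever default they are given.
users = 1000
switch_rate = 0.10  # assumed fraction of users who change the default

tracked_under_opt_out = users * (1 - switch_rate)  # default on:  900 users tracked
tracked_under_opt_in = users * switch_rate         # default off: 100 users tracked
print(tracked_under_opt_out, tracked_under_opt_in)  # 900.0 100.0
```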

Similarly, to comply with the GDPR, Facebook will begin requiring EU users to opt in to the social media platform’s facial recognition feature. In the United States, by contrast, Facebook’s facial recognition feature is enabled by default. Because consumers, when presented with multiple choices, typically stick with the default option, a likely consequence of Facebook’s opt-in regime for EU users will be a reduction in the usefulness of its facial recognition feature for automatically identifying users in photos.

The GDPR imposes a major new liability on data collectors, requiring that they provide a mechanism for authenticating users who want their data erased, along with a mechanism for erasing the data. Although economic relationships that either party may terminate at any time and for any reason are often desirable, such as in the context of at-will employment, other transactions become far less attractive when long-term agreements are forbidden, such as landlord-tenant relationships and many business-to-business service contracts. By depriving users of the freedom to grant irrevocable or long-term consent to companies that process and control their data, the GDPR may actually diminish the value of user data—and, with it, users’ ability to obtain beneficial goods and services in exchange for sharing information about themselves with third parties.
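
How a firm might authenticate erasure requests is left to its own design. One plausible approach, sketched below with Python’s standard hmac module (the flow and names are hypothetical), ties each request to a token delivered over a verified channel so that an impersonator cannot trigger deletion of someone else’s data:

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # illustrative only; keep real keys out of source

def issue_erasure_token(user_id: str) -> str:
    """Step 1: send the account holder a token proving the request is
    theirs (e.g., via their verified email address)."""
    return hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()

def erase_if_authentic(store: dict, user_id: str, token: str) -> bool:
    """Step 2: erase only when the presented token matches."""
    expected = issue_erasure_token(user_id)
    if hmac.compare_digest(expected, token):
        store.pop(user_id, None)
        return True
    return False

store = {"u1": {"email": "a@example.com"}}
token = issue_erasure_token("u1")
print(erase_if_authentic(store, "u1", token))     # True: data erased
print(erase_if_authentic(store, "u2", "forged"))  # False: request rejected
```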

The GDPR’s rules purport to enhance transparency, privacy, and security, but they will also reduce consumer choice. When users are free to share certain information with companies on an indefinite basis, companies can accumulate, acquire, and sell large data sets for a variety of purposes, from improving ad tailoring to building better models for pricing credit risk. The GDPR, however, bars consumers from obtaining a valuable service in exchange for letting a company use their data for a fixed, guaranteed period of time. An April 2018 study by RMIT University in Melbourne, Australia, found that these stipulations will alter the economics of data-driven businesses by introducing considerable uncertainty around the value of personal data. Under U.S. law, whether and when a user may revoke consent to the storage or processing of his or her data by a third party generally depends on the terms by which the user originally shared the information. But under the GDPR, a company that collects personal data can only speculate as to how long it will remain free to retain such data.

As American consumers have seen from experience, a focus on notice and choice, rather than prescriptive mandates, has enabled new and innovative companies to form and thrive in the United States much more than in Europe. Although many factors explain Europe’s lackluster track record in high-tech entrepreneurship as compared to the United States, the EU regulatory regime is among the major reasons for Silicon Valley’s greater success. The previous European Data Protection Directive was more stringent than the U.S. regime, especially with respect to user consent requirements, and it caused demonstrable harm to economic dynamism and the technology startup ecosystem. For instance, implementation of past EU data regulations contributed to a 65 percent reduction in the effectiveness of Internet advertising in the months that followed. The GDPR will greatly add to these costs.

New “Digital Rights” and their Conflicting Goals. The GDPR introduces new “digital rights,” many of which conflict with one another.

The first significant change is the right to data portability, which is aimed at reducing perceived “lock-in” effects online. The idea is that companies that exploit network effects to acquire more data online gain a competitive advantage, making it likely that they will remain dominant players. There is little empirical evidence to support this claim, especially in light of the large number of once seemingly dominant online companies that have disappeared. Yet GDPR advocates argue that data portability will increase competition by reducing whatever network effects exist.

Allowing users to request all their data in a commonly used machine-readable format conflicts with another GDPR value, privacy by design, which calls for privacy to be considered at every stage of developing any new product that involves personal data. Demanding that firms furnish each user a portable copy of his or her data on request creates a major point of vulnerability, especially given the potential for a malicious actor to impersonate a user or infiltrate a user’s devices. To mitigate this privacy risk while complying with the GDPR, many companies will have to spend considerably more on security while redesigning their systems to create a new access point for certain types of user data. Both measures impose significant costs on companies, and the tension between portability and privacy by design cannot be neatly resolved. As Professor Gus Hurwitz of the Nebraska College of Law has noted, “the easier that [a company] makes it for its users’ data to be exported at scale, the easier [a company] makes it for its users’ data to be exfiltrated at scale.”
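
What those extra security measures might look like is a design choice, not a GDPR prescription. One possible hardening of an export endpoint, sketched below with assumed policy values (such as a one-export-per-day cooldown), combines forced re-authentication with rate limiting so that a compromised session cannot exfiltrate data at scale:

```python
import time

EXPORT_COOLDOWN_SECONDS = 24 * 3600  # assumed policy: at most one export per day
_last_export: dict = {}              # user_id -> timestamp of last export

def request_export(user_id: str, reauthenticated: bool) -> bool:
    """Gate bulk exports behind fresh re-authentication and a cooldown."""
    if not reauthenticated:
        return False  # stale session: force a fresh login first
    now = time.time()
    if now - _last_export.get(user_id, 0.0) < EXPORT_COOLDOWN_SECONDS:
        return False  # throttle repeated export requests
    _last_export[user_id] = now
    return True       # proceed to assemble the portable archive

print(request_export("u1", reauthenticated=True))   # True: first request allowed
print(request_export("u1", reauthenticated=True))   # False: throttled
print(request_export("u2", reauthenticated=False))  # False: must re-authenticate
```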

The right to erasure, an extension of the EU’s controversial “right to be forgotten,” poses significant challenges to freedom of expression. The right to be forgotten, which was established in the EU prior to the GDPR, has already led to the censorship of truthful speech, as in a recent ruling that forced Google to remove links to websites that accurately reported on the prior convictions of an EU-based businessman. Since 2014, Google alone has received over 650,000 requests to remove websites from its search results pursuant to the EU’s right to be forgotten. The rule has also hampered the ease with which data moves globally: U.S. law does not recognize certain European digital rights, which has led to some U.S. companies being sued for moving EU residents’ data to U.S. facilities.

As U.S. policy makers consider GDPR-style regulations, they must consider the conflicts such rules pose not only with each other, but with more fundamental constitutional rights.

How the GDPR Obstructs Innovation. GDPR rules risk hampering innovation in important ways beyond the large costs and technological complexities of compliance. Major disruptive technologies—including blockchain, artificial intelligence, and the Internet of Things (IoT)—are all at risk of either delayed implementation or outright prohibition under the GDPR. More stringent consent requirements will harm startups that need access to data to improve their products and grow their user bases.

The GDPR’s “data minimization” requirement would limit the freedom researchers have long enjoyed to harness large data sets to accomplish technological breakthroughs. Under the GDPR, data minimization allows companies to collect and store data only to the extent that it is necessary to render the services for which the data were obtained, and requires companies to seek additional user approval before using their data in any way not originally specified. However, future beneficial uses of data are often unforeseen at the time of its collection. The freedom to experiment with data, including through serendipitous uses, is crucial to enabling innovation. Restricting collection and experimental use severely impairs innovation. In fact, many of the most successful online companies, such as Facebook, arose out of data experimentation for unrelated purposes. 
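
The mechanics of purpose limitation can be sketched in a few lines. The class below is a hypothetical illustration, not a reference implementation: every use of the data outside the purposes declared at collection is blocked until the user grants fresh approval, which is precisely what forecloses serendipitous reuse:

```python
class PurposeLimitedStore:
    """Sketch of GDPR-style purpose limitation for personal data."""

    def __init__(self):
        self._data = {}      # user_id -> stored value
        self._purposes = {}  # user_id -> set of approved purposes

    def collect(self, user_id, value, purpose):
        """Data may be collected only for a declared purpose."""
        self._data[user_id] = value
        self._purposes[user_id] = {purpose}

    def approve(self, user_id, new_purpose):
        """Models the extra consent step for an unforeseen future use."""
        self._purposes[user_id].add(new_purpose)

    def use(self, user_id, purpose):
        """Any use outside the approved purposes is rejected."""
        if purpose not in self._purposes.get(user_id, set()):
            raise PermissionError(f"no consent for purpose: {purpose}")
        return self._data[user_id]

store = PurposeLimitedStore()
store.collect("u1", "Paris", purpose="shipping")
store.use("u1", "shipping")      # permitted: declared at collection
# store.use("u1", "research")    # would raise PermissionError: unforeseen use
store.approve("u1", "research")  # fresh user approval required first
store.use("u1", "research")      # now permitted
```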

Blockchain technology, which has major promising implications for areas from financial services to land registry, faces perhaps the greatest compliance risk under the GDPR. This distributed ledger technology, best known for empowering the global cryptocurrency market, is predicated in large part on its “immutability”—the inability of users to alter items on the blockchain after they have been processed. This security feature conflicts with the right of erasure, because information on the blockchain cannot be destroyed.
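
The conflict is structural, as a minimal hash-chain sketch shows. Because each block commits to the hash of its predecessor, “erasing” any earlier record breaks verification of everything after it (the transaction strings below are invented):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    return {"data": data, "prev_hash": prev_hash}

def chain_is_valid(chain: list) -> bool:
    """Each block commits to its predecessor's hash, so altering or
    erasing any earlier record invalidates every later block."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

genesis = make_block("genesis", "0" * 64)
b1 = make_block("alice -> bob: 5", block_hash(genesis))
b2 = make_block("bob -> carol: 2", block_hash(b1))
chain = [genesis, b1, b2]
print(chain_is_valid(chain))  # True

b1["data"] = ""               # attempt to "erase" a record
print(chain_is_valid(chain))  # False: the chain no longer verifies
```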

The public nature of blockchain, which allows for independent verifiability of transactions, also conflicts with the GDPR’s principle of privacy by design, given that the information on the ledger remains potentially identifiable to an individual. How these features of blockchain can be reconciled with the GDPR remains to be seen. However, given that the global market capitalization of cryptocurrencies exceeded $350 billion as of April 19, 2018, considerable economic potential will be forgone if these issues are not resolved.

How the GDPR will affect artificial intelligence is less clear, as the regulation contains no single provision implementing a “right to explanation” for algorithmic decision-making, under which individuals affected by machine learning models would be entitled to understand how a model reached its decision. Nevertheless, some legal scholars and policy makers have inferred that such a right not only exists, but is potentially enforceable in the future. The UK House of Lords recommended a right to explanation in its recent report on AI. And given that interpreting the GDPR is left to individual EU member states, certain countries will likely adopt this new digital right, granting individuals an entitlement to a lay explanation of how an algorithmic decision was made and the ability to request human review of AI-based decisions. In the field of artificial intelligence, there is a clear tradeoff between a decision’s interpretability and its predictive accuracy. A right to explanation would therefore push firms toward less accurate, but more interpretable, algorithms, discouraging investment in more innovative AI techniques that could not lawfully be deployed in the market.
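
The interpretability-accuracy tradeoff can be seen in miniature. A simple linear scorer (with invented weights, sketched below) can decompose its decision into per-feature contributions, which is exactly the kind of lay explanation that more accurate but opaque models, such as deep neural networks, generally cannot provide:

```python
# Hypothetical linear credit scorer; weights and inputs are invented.
WEIGHTS = {"income": 0.5, "debt": -0.75, "years_employed": 0.25}

def score(applicant: dict) -> float:
    """The decision: a weighted sum of the applicant's features."""
    return sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)

def explain(applicant: dict) -> dict:
    """A lay explanation: how much each input pushed the decision."""
    return {k: WEIGHTS[k] * applicant[k] for k in WEIGHTS}

applicant = {"income": 4.0, "debt": 2.0, "years_employed": 3.0}
print(score(applicant))    # 1.25
print(explain(applicant))  # {'income': 2.0, 'debt': -1.5, 'years_employed': 0.75}
```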

If the GDPR’s consent requirements are interpreted and enforced strictly, they will harm the adoption of the many Internet of Things technologies that depend on linking disparate data sets. Because IoT technology often requires linking different activities, the new consent rules would require constant user verification of changes, reducing the total accessible data set. In the UK, where consent requirements have been interpreted most strictly, the economics consultancy London Economics found that the requirements have caused firms to “move data collection and analysis in-house rather than outsourcing it to specialist analytics and data providers, thereby undoing the benefits of specialization and entrenching the market power of larger firms.” This raises barriers to market entry for startups and reduces the incentive to innovate.

Conclusion. The General Data Protection Regulation is the most significant change to the legal treatment of data in history, with implications that have already spilled well beyond the European Union’s borders. The rapid timeline for implementing these sweeping policy changes has left companies unprepared for a complex set of legal and technological challenges. The economic effects will include greater market concentration, as small firms and startups struggle to comply; conflicting priorities for businesses; greater inconvenience for users; and reduced innovation. As the United States considers new privacy legislation, lawmakers should learn from the GDPR’s many flaws, rather than adopting it wholesale.