Can We Really Forget?

I expected this post would turn out differently.

I had intended to commend the European Court of Justice for placing sensible limits on the extraterritorial enforcement of the EU’s Right to be Forgotten. They did, albeit in a limited way,[1] and it was a good decision. There.  I did it. In 154 words.

Now for the remaining 1400 or so words.

But reading the decision pushes me back into frustration at the entire Right to be Forgotten regime and its illogical and destructive basis. The fact that a court recognizes that the EU cannot (generally) force foreign companies to violate the laws of their own countries on internet sites intended for use within those countries (and NOT the EU) does not come close to offsetting the logical, practical, and societal problems with the way the EU perceives and enforces the Right to be Forgotten.

As a lawyer whose thinking is grounded in the U.S. Constitution, I am comfortable with the First Amendment’s protection of Freedom of Speech – that nearly any truthful utterance or publication is inviolate, and that the foundation of our political and social system depends on open exposure of facts to sunlight. Intentionally shoving those true facts into the dark is wrong in our system, and openness will be protected by U.S. courts.

Believe it or not, the European Union has such a concept at the core of its foundation too. Article 10 of the European Convention on Human Rights states:

“Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.”

So we have the same values, right? In both jurisdictions the right to impart information can be exercised without interference by public authority. Not so fast. European law contains a litany of restrictions on this right, including a limitation of your right to free speech in order to protect the reputation of others.

This seems like a complete evisceration of a right to open communication if a court can force obfuscation of facts just to protect someone’s reputation. Does this person deserve a bad reputation? Has he or she committed a crime, failed to pay his or her debts, harmed animals or children, stalked an ex-lover, or violated an oath of office, marriage, priesthood or citizenship? It doesn’t much matter in the EU. The right of that person to hide his/her bad or dangerous behavior outweighs both the allegedly fundamental freedom to impart true information AND the public’s right to protect itself from someone who has proven himself/herself to be a risk to the community.

So how does this tension play out over the internet? In the EU, it is law that Google and other search engines must remove links to true facts about any wrongdoer who feels his/her reputation may be tarnished by the discovery of the truth about that person’s behavior. Get into a bar fight? Don’t worry, the EU will put the entire force of law behind your request to wipe that off your record. Stiff your painting contractors for tens of thousands of euros despite their good performance? Don’t worry, the EU will make sure nobody can find out. Get fired, removed from office or defrocked for dishonesty? Don’t worry, the EU has your back.

And that undercutting of speech rights has now been codified as the Right to be Forgotten in Article 17 of Regulation (EU) 2016/679 (the GDPR).

And how does this new decision affect the rule? In the past couple of weeks, the Grand Chamber of the EU Court of Justice issued an opinion limiting the extraterritorial reach of the Right to be Forgotten (Google v. CNIL, Case C‑507/17). The decision confirms that search engines must remove links to certain embarrassing instances of true reporting, but need only do so on the versions of the search engine that intentionally serve the EU, not necessarily on versions serving non-EU jurisdictions.
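For readers who think in code, here is a rough, hypothetical sketch of the geographic scoping the decision describes: a granted de-referencing request suppresses the listed links only on versions of the search engine directed at the EU, while versions aimed at other jurisdictions continue to return them. The names, data structures, and domain list are assumptions for illustration, not Google’s actual implementation.

```typescript
// Hypothetical sketch of geographically scoped de-referencing, in the spirit of
// Google v. CNIL (C-507/17): suppress de-referenced links only on versions of
// the search engine directed at the EU. All names here are illustrative.

const EU_VERSIONS = new Set(["google.fr", "google.de", "google.ie"]); // assumed list

interface SearchResult {
  url: string;
  title: string;
}

// URLs a data subject has successfully asked to have de-referenced for searches
// on their name (hypothetical example data).
const deReferenced: Map<string, Set<string>> = new Map([
  ["jane doe", new Set(["https://example.com/old-story"])],
]);

function filterResults(
  query: string,
  results: SearchResult[],
  engineVersion: string // e.g. "google.fr" (EU-directed) vs. "google.com" (non-EU)
): SearchResult[] {
  // Non-EU versions of the engine are unaffected by the de-referencing order.
  if (!EU_VERSIONS.has(engineVersion)) return results;
  // EU-directed versions drop the links covered by the granted request.
  const suppressed = deReferenced.get(query.toLowerCase()) ?? new Set<string>();
  return results.filter((r) => !suppressed.has(r.url));
}
```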

The problems with appointing Google as an extrajudicial magistrate, enforcing vague EU-granted rights under a highly ambiguous set of standards, and then fining it when you don’t like a decision you forced it to make deserve a separate post.

Why did we even need this decision? Because the French data privacy protection agency, known as CNIL, fined Google for not removing presumably true data from non-EU search results concerning, as Reuters described it, “a satirical photomontage of a female politician, an article referring to someone as a public relations officer of the Church of Scientology, the placing under investigation of a male politician and the conviction of someone for sexual assaults against minors.” So, to be clear, while the official French agency believes it should enforce a right for people to hide from the whole world the fact that they have been convicted of sexual assault against children, the Grand Chamber of the European Court of Justice believes that people convicted of child sexual assault should be protected in their right to obscure these facts only from people in Europe. This is progress.

Of course, in the U.S., politicians and other public figures under investigation or subject to satire, and people convicted of sexual assault against children, do not have a right to protect their reputations by forcing Google to remove links to public records or stories in news outlets. We believe both that society is better when facts are allowed to be reported and disseminated, and that society is protected by reporting on formal allegations against public figures or criminal convictions of private ones.

I am glad that the EU Court of Justice is willing to confine its rules to its own jurisdiction where they openly conflict with the basic laws of other jurisdictions. The Court sensibly held,

“The idea of worldwide de-referencing may seem appealing on the ground that it is radical, clear, simple and effective. Nonetheless, I do not find that solution convincing, because it takes into account only one side of the coin, namely the protection of a private person’s data.[2] . . . [T]he operator of a search engine is not required, when granting a request for de-referencing, to operate that de-referencing on all the domain names of its search engine in such a way that the links at issue no longer appear, regardless of the place from which the search on the basis of the requester’s name is carried out.”

Any other decision would be wildly overreaching. Believe me, every country in the EU would be howling in protest if the U.S. decided that its views of personal privacy must be enforced in Europe by European companies whose operations are aimed only at Europe. It should work both ways. So this was a well-reasoned limitation.

But I just cannot bring myself to be complimentary of a regime that I find so repugnant – where nearly any bad action can be swept under the rug in the name of protecting a person’s reputation.

As I have written in books and articles in the past, government protection of personal privacy is crucial for the clean and correct operation of a democracy.  However, privacy is also the obvious refuge of scoundrels – people prefer to keep the bad things they do private. Who wouldn’t? But one can go overboard protecting this right, and it feels like the EU has institutionalized its leap overboard.

I would rather err on the side of sunshine, giving up some privacy in the service of revealing the truth, than err on the side of darkness, allowing bad deeds to be obscured so that those who commit them can maintain their reputations.  Clearly, the EU doesn’t agree with me.


[1] The Court, in this case, wrote, “The issues at stake therefore do not require that the provisions of Directive 95/46 be applied outside the territory of the European Union. That does not mean, however, that EU law can never require a search engine such as Google to take action at worldwide level. I do not exclude the possibility that there may be situations in which the interest of the European Union requires the application of the provisions of Directive 95/46 beyond the territory of the European Union; but in a situation such as that of the present case, there is no reason to apply the provisions of Directive 95/46 in such a way.”

[2] EU Court of Justice case C-136/17, which states, “While the data subject’s rights [to privacy] override, as a general rule, the freedom of information of internet users, that balance may, however, depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information. . . .”

Copyright © 2019 Womble Bond Dickinson (US) LLP All Rights Reserved.

For more on the EU’s GDPR enforcement, see the National Law Review Communications, Media & Internet law page.

Google Fined $57 Million in First Major Enforcement of GDPR Against a US-based Company

On January 21, 2019, Google was fined nearly $57 million (approximately 50 million euros) by France’s Data Protection Authority, CNIL, for an alleged violation of the General Data Protection Regulation (GDPR).[1] CNIL found Google violated the GDPR based on a lack of transparency, inadequate information, and lack of valid consent regarding ad personalization. This fine is the largest imposed under the GDPR since it went into effect in May 2018 and the first to be imposed on a U.S.-based company.

CNIL began investigating Google’s practices based on complaints received from two consumer privacy rights organizations alleging that Google did not have a valid legal basis to process the personal data of the users of its services, particularly for Google’s personalized advertisement purposes. The first of the complaints was filed on May 25, 2018, the effective date of the GDPR.

Following its investigation, CNIL found the general structure of the information required to be disclosed by Google relating to its processing of users’ information was “excessively disseminated across several documents.” CNIL stated the relevant information pertaining to privacy rights was only available after several steps, which sometimes required up to five or six actions. Moreover, CNIL indicated users were not able to fully understand the extent of the processing operations carried out by Google because the operations were described in a “too generic and vague manner.” Additionally, the regulator determined information regarding the retention period was not provided for some data collected by Google.

Google’s process for obtaining user consent to data collection for advertisement personalization was also alleged to be problematic under the GDPR. CNIL stated Google users’ consent was not considered to be sufficiently informed due to the information on processing operations for advertisement being spread across several documents. The consent obtained by Google was not deemed to be specific to any individual Google service, and CNIL determined it was impossible for the user to be aware of the extent of the data processed and combined.

Finally, CNIL determined the user consent captured by Google was not “specific” or “unambiguous” as these terms are defined by the GDPR. By way of example, CNIL noted that Google’s users were asked to click the boxes “I agree to Google’s Terms of Service” and “I agree to the processing of my information as described above and further explained in the Privacy Policy” in order to create the account. As a result, the user was required to give consent, in full, for all processing purposes carried out by Google on the basis of this consent, rather than for distinct purposes, as required under the GDPR. Additionally, CNIL commented that the checkbox Google used to capture user consent relating to ad personalization was “pre-clicked.” The GDPR requires consent to be “unambiguous,” given through a clear affirmative action from the user, which, according to CNIL, means clicking a box that has not been pre-clicked.
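To make the regulator’s distinction concrete, here is a minimal, hypothetical sketch of consent capture along the lines CNIL described: each processing purpose gets its own opt-in, every choice starts unchecked, and consent is recorded per purpose rather than bundled into one blanket “I agree.” The types, purposes, and function names are illustrative assumptions, not a statement of what the GDPR or CNIL formally requires in code.

```typescript
// Hypothetical sketch of purpose-specific, unambiguous consent capture.
// Purposes, names, and structure are illustrative assumptions only.

type Purpose = "ad_personalization" | "analytics" | "speech_processing";

interface ConsentRecord {
  purpose: Purpose;
  granted: boolean;      // must come from an affirmative user action
  timestamp: string;     // when the choice was made
  policyVersion: string; // which privacy notice the user was shown
}

// Every choice starts unchecked: a pre-ticked box or silence is not consent.
const defaultChoices: Record<Purpose, boolean> = {
  ad_personalization: false,
  analytics: false,
  speech_processing: false,
};

// Record the user's choice separately for each purpose, rather than one
// blanket acceptance covering all processing operations.
function captureConsent(
  choices: Record<Purpose, boolean>,
  policyVersion: string
): ConsentRecord[] {
  const now = new Date().toISOString();
  return (Object.keys(choices) as Purpose[]).map((purpose) => ({
    purpose,
    granted: choices[purpose] === true,
    timestamp: now,
    policyVersion,
  }));
}

// A processing operation checks consent for its own specific purpose,
// not a single catch-all "I agree" flag.
function mayProcess(records: ConsentRecord[], purpose: Purpose): boolean {
  return records.some((r) => r.purpose === purpose && r.granted);
}
```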

This fine may be appealed by Google, which indicated it remained committed to meeting the “high standards of transparency and control” expected by its users and to complying with the consent requirements of the GDPR. Google indicated it would study the decision to determine next steps. Given that Google is the first U.S.-based company against which a DPA has attempted GDPR enforcement, and given the size of the fine imposed, it will be interesting to watch how Google responds.

The GDPR enforcement action against Google should be seen as a message to all U.S.-based organizations that collect the data of citizens of the European Union. Companies should review their privacy policies, practices, and end-user agreements to ensure they are compliant with the consent requirements of the GDPR.


© 2019 Dinsmore & Shohl LLP. All rights reserved.
This post was written by Matthew S. Arend and Jared M. Bruce of Dinsmore & Shohl LLP.