Privacy Tip #359 – GoodRx Settles with FTC for Sharing Health Information for Advertising

The Federal Trade Commission (FTC) announced on February 1, 2023, that it has settled, for $1.5 million, its first enforcement action under its Health Breach Notification Rule, brought against GoodRx Holdings, Inc., a telehealth and prescription drug provider.

According to the press release, the FTC alleged that GoodRx failed “to notify consumers and others of its unauthorized disclosures of consumers’ personal health information to Facebook, Google, and other companies.”

Under the proposed federal court order (the Order), GoodRx will be “prohibited from sharing user health data with applicable third parties for advertising purposes.” The complaint alleged that GoodRx told consumers it would not share their personal health information, yet monetized that information by sharing it with third parties such as Facebook and Instagram to target users with personalized health- and medication-specific ads.

The complaint also alleged that GoodRx “compiled lists of its users who had purchased particular medications such as those used to treat heart disease and blood pressure, and uploaded their email addresses, phone numbers, and mobile advertising IDs to Facebook so it could identify their profiles. GoodRx then used that information to target these users with health-related advertisements.” It also alleged that those third parties then used the information received from GoodRx for their own internal purposes to improve the effectiveness of the advertising.

The proposed Order must be approved by a federal court before it can take effect. To address the FTC’s allegations, the Order prohibits the sharing of health data for ads; requires user consent for any other sharing; stipulates that the company must direct third parties to delete consumer health data; limits the retention of data; and requires implementation of a mandated privacy program. Read the FTC’s press release for further detail.

Copyright © 2023 Robinson & Cole LLP. All rights reserved.

Italian Garante Bans Google Analytics

On June 23, 2022, Italy’s data protection authority (the “Garante”) determined that a website’s use of the audience measurement tool Google Analytics is not compliant with the EU General Data Protection Regulation (“GDPR”), as the tool transfers personal data to the United States, which does not offer an adequate level of data protection. In making this determination, the Garante joins other EU data protection authorities, including the French and Austrian regulators, that also have found use of the tool to be unlawful.

The Garante determined that websites using Google Analytics collected, via cookies, personal data including user interactions with the website, pages visited, browser information, operating system, screen resolution, selected language, date and time of page views, and the user device’s IP address. This information was transferred to the United States without the additional safeguards for personal data required under the GDPR following the Schrems II determination, and therefore faced the possibility of governmental access. In the Garante’s ruling, website operator Caffeina Media S.r.l. was ordered to bring its processing into compliance with the GDPR within 90 days, but the ruling has wider implications, as the Garante commented that it had received many “alerts and queries” relating to Google Analytics. It also called upon “all controllers to verify that the use of cookies and other tracking tools on their websites is compliant with data protection law; this applies in particular to Google Analytics and similar services.”
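
To make the mechanics concrete, the sketch below shows roughly what a single page-view “hit” looks like under the legacy Universal Analytics version of Google Analytics at issue in these rulings, using parameter names from Google’s published Measurement Protocol. The property ID and values are placeholders, not Caffeina Media’s actual configuration; note that the user’s IP address is not an explicit parameter, since it reaches Google’s U.S. servers implicitly with the HTTP request itself.

```python
# A rough illustration (placeholder values, legacy Universal Analytics
# Measurement Protocol) of the kind of data a page-view hit sends to
# Google's servers. The IP address travels implicitly with the request.
import requests

hit = {
    "v": "1",             # protocol version
    "tid": "UA-XXXXX-Y",  # property ID (placeholder)
    "cid": "555",         # client ID, normally read from the _ga cookie
    "t": "pageview",      # hit type
    "dp": "/home",        # page visited
    "sr": "1920x1080",    # screen resolution
    "ul": "it-it",        # selected language
}
requests.post("https://www.google-analytics.com/collect", data=hit)
```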

Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

Google to Launch Google Analytics 4 in an Attempt to Address EU Privacy Concerns

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4.” Google Analytics 4 aims, among other things, to address recent developments in the EU regarding the use of analytics cookies and data transfers resulting from such use.

Background

On August 17, 2020, the non-governmental organization None of Your Business (“NOYB”) filed 101 identical complaints with 30 European Economic Area data protection authorities (“DPAs”) regarding the use of Google Analytics by various companies. The complaints focused on whether the transfer of EU personal data to Google in the U.S. through the use of cookies is permitted under the EU General Data Protection Regulation (“GDPR”), following the Schrems II judgment of the Court of Justice of the European Union. Following these complaints, the French and Austrian DPAs ruled that the transfer of EU personal data from the EU to the U.S. through the use of the Google Analytics cookie is unlawful.

Google’s New Solution

According to Google’s press release, Google Analytics 4 “is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

The most impactful change from an EU privacy standpoint is that Google Analytics 4 will no longer store IP addresses, thereby limiting the data transfers resulting from the use of Google Analytics that came under scrutiny in the EU following the Schrems II ruling. It remains to be seen whether this change will ease EU DPAs’ concerns about Google Analytics’ compliance with the GDPR.

Google’s previous analytics solution, Universal Analytics, will no longer be available beginning July 2023. In the meantime, companies are encouraged to transition to Google Analytics 4.

Read Google’s press release.

Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

CNIL Fines Google and Amazon 135 Million Euros for Alleged Cookie Violations

On December 10, 2020, the French Data Protection Authority (the “CNIL”) announced that it has levied fines of €60 million on Google LLC and €40 million on Google Ireland Limited under the French cookie rules for their alleged failure to (1) obtain the consent of users of the French version of Google’s search engine (google.fr) before setting advertising cookies on their devices; (2) provide users with adequate information about the use of cookies; and (3) implement a fully effective opt-out mechanism to enable users to refuse cookies. On the same date, the CNIL announced that it has levied a fine of €35 million on Amazon Europe Core under the same rules for its alleged failure to (1) obtain the consent of users of the amazon.fr site before setting advertising cookies on their devices; and (2) provide adequate information about the use of cookies.

Background

The French cookie rules are laid down in (1) Article 82 of the French Data Protection Act, which implements into French law the provisions of the EU ePrivacy Directive governing the use of cookies; and (2) soft law instruments aimed at guiding operators in implementing Article 82 of the French Data Protection Act in practice.

While the provisions of Article 82 of the French Data Protection Act have remained unchanged, the CNIL revised its soft law instruments to take into account the strengthened consent requirements of the EU General Data Protection Regulation (“GDPR”). On July 18, 2019, the CNIL published new guidelines on the use of cookies and similar technologies (the “Guidelines”). The Guidelines repealed the CNIL’s 2013 cookie recommendations that were no longer valid in light of the GDPR’s consent requirements. The Guidelines were to be complemented by recommendations on the practical modalities for obtaining users’ consent to set or read non-essential cookies and similar technologies on their devices (the “Recommendations”). On October 1, 2020, the CNIL published a revised version of its Guidelines and its final Recommendations. The CNIL announced that it would allow for a transition period of six months to comply with the new cookie law rules (i.e., until the end of March 2021), and that it would carry out inspections to enforce the new rules after that transition period. However, the CNIL made clear that it reserves the right to take action against certain infringements, especially in cases of particularly serious infringements of the right to privacy. In addition, the CNIL announced that it would continue to investigate infringements of the previous cookie law rules.

Against that background, in December 2019 and on March 6 and May 19, 2020, the CNIL carried out three remote inspections of the amazon.fr website and an onsite inspection at the premises of the French establishment of the Amazon group, Amazon Online France SAS. On March 16, 2020, the CNIL also carried out a remote inspection of the google.fr site. These inspections aimed to verify whether Google LLC, Google Ireland Limited and Amazon Europe Core complied with the French Data Protection Act, and in particular with its Article 82, when setting or reading non-essential cookies on the devices of users living in France who visited the google.fr and amazon.fr websites. In its press releases, the CNIL stressed that its sanctions against Google and Amazon punished breaches of obligations that existed before the GDPR and are not part of the obligations clarified by the new Guidelines and Recommendations.

CNIL’s Jurisdiction Over Google Ireland Limited’s and Amazon Europe Core’s Cookie Practices

Google and Amazon challenged the jurisdiction of the CNIL, arguing that (1) the cooperation mechanism of the GDPR (known as the one-stop-shop mechanism) should apply and the CNIL is not their lead supervisory authority for the purposes of that mechanism; and (2) their cookie practices do not fall within the territorial scope of the French Data Protection Act. Pursuant to its Article 3, the French Data Protection Act applies to the processing of personal data carried out in the context of the activities of an establishment of a data controller (or data processor) in France. In that respect, Amazon argued that its French establishment was not involved in the setting of cookies on the amazon.fr site and that there is no inextricable link between the activities of the French establishment and the setting of cookies by Amazon Europe Core, the Luxembourg affiliate of the Amazon group responsible for the European Amazon websites, including the French site. Google argued that the one-stop-shop mechanism should apply because its Irish affiliate, Google Ireland Limited, is the actual headquarters of the Google group in Europe and thus its main establishment for the purposes of that mechanism; accordingly, the Irish Data Protection Commissioner would be the only competent supervisory authority.

Inapplicability of the One-Stop-Shop Mechanism of the GDPR

In the initial version of its Guidelines, the CNIL made clear that it may take any corrective measures and sanctions under Article 82 of the French Data Protection Act, independently of the GDPR’s cooperation and consistency mechanisms, because the French cookie rules are based on the EU ePrivacy Directive and not the GDPR. Unsurprisingly, therefore, the CNIL rejected the arguments of Google and Amazon, considering that the EU ePrivacy Directive provides for its own mechanism, designed to implement and control its application. Accordingly, the CNIL concluded that the one-stop-shop mechanism of the GDPR does not apply to the enforcement of the provisions of the EU ePrivacy Directive, as implemented under French law.

To prevent such a situation in the future and ensure consistent interpretation and enforcement of both sets of rules, the European Data Protection Board (the “EDPB”) has called for the GDPR’s cooperation and consistency mechanism to be used for the supervision of the future cookie rules under the ePrivacy Regulation, which will replace the ePrivacy Directive. The CNIL did not wish to pre-empt this future development, and applied the relevant texts literally in its cases against Google and Amazon.

CNIL’s Territorial Jurisdiction

The CNIL, citing the rulings of the Court of Justice of the European Union in the Google Spain and Wirtschaftsakademie cases, took the view that the use of cookies on the French sites (google.fr and amazon.fr, respectively) was carried out in the context of the activities of the companies’ French establishments, because those establishments promote their respective products and services in France.

Controllership Status of Google LLC

Following its investigation, the Rapporteur of the CNIL considered that Google Ireland Limited and Google LLC are joint controllers in respect of the processing consisting of accessing or storing information on the devices of Google Search users living in France.

Google argued that Google Ireland Limited is solely responsible for those operations and that Google LLC is a processor. Google stressed that (1) its Irish affiliate participates in the various decision-making bodies and in the different stages of the decision-making process implemented by the group to define the characteristics of the cookies set on Google Search; and (2) differences exist between the cookies set on European users’ devices and those set on the devices of other users (e.g., shorter retention periods, no personalized ads served to children within the meaning of the GDPR, etc.), which demonstrate the decision-making autonomy of Google Ireland Limited.

In its decision, the CNIL found that Google LLC is also represented in the bodies that adopt decisions relating to the deployment of Google products within the European Economic Area and in Switzerland, and to the processing of personal data of users living in those regions. The CNIL also found that Google LLC exercises a decisive influence in those decision-making bodies. The CNIL further found that the differences in the cookie practices were just differences in implementation, mainly intended to comply with EU law. According to the CNIL, those differences do not affect the global advertising purpose for which the cookies are used. In the CNIL’s view, this purpose is also determined by Google LLC, and the differences invoked by Google are not sufficient to demonstrate the decision-making autonomy of Google Ireland Limited. In addition, the CNIL found that Google LLC also participates in the determination of the means of processing since Google LLC designs and builds the technology of cookies set on the European users’ devices. The CNIL concluded that Google LLC and Google Ireland Limited are joint controllers.

Cookie Violations

Setting of advertising cookies without obtaining the user’s prior consent

The CNIL’s inspection of the google.fr website revealed that, when users visited that site, seven cookies were automatically set on their device. Four of these cookies were advertising cookies.

In the case of Amazon, the investigation revealed that, whenever users first visited the home page of the amazon.fr website or visited the site after they clicked on an ad published on another site, more than 40 advertising cookies were automatically set on their device.

Since advertising cookies require users’ prior consent, the CNIL concluded that the companies failed to comply with the cookie consent requirement of Article 82 of the French Data Protection Act.
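
For readers curious how such inspections can be approximated, here is a minimal sketch (not the CNIL’s actual methodology) that lists the cookies a site sets on a first, consent-free page load. The URL is a placeholder; and because this captures only cookies set via HTTP Set-Cookie headers, cookies set by JavaScript advertising tags would require a real browser (e.g., one driven by Selenium or Playwright) to observe.

```python
# A minimal sketch: list cookies set on a first page load, before any
# interaction with a consent banner. Captures only HTTP Set-Cookie
# headers; JavaScript-set cookies need a real, scripted browser.
import requests

URL = "https://www.example.com"  # placeholder for the site under review

session = requests.Session()
session.get(URL, timeout=10)

print(f"Cookies set on first load of {URL}:")
for cookie in session.cookies:
    print(f"  {cookie.name}  (domain={cookie.domain}, expires={cookie.expires})")
```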

Lack of adequate information provided to users

When the CNIL inspected the google.fr website, it found an information banner displayed at the bottom of the page with the note “Privacy reminder from Google” and two buttons: “Remind me later” and “Access now.” According to the CNIL, the banner did not provide users with information regarding the cookies that had already been set on their device, and that information was also not immediately provided when users clicked on the “Access now” button. Google amended its cookie practices in September 2020. However, the CNIL found that the new pop-up window does not provide clear and complete information to users under Article 82 of the French Data Protection Act. In the CNIL’s view, the new pop-up window does not inform users of all the purposes of the cookies or of the means available to them to refuse cookies. In particular, the CNIL found that the information provided does not enable users to understand the type of content and ads that may be personalized based on their behavior (e.g., whether the advertising is geolocation-based), the precise nature of the Google services that use personalization, or whether this personalization is carried out across different services. Further, the CNIL found that the terms “options” and “See more” in the new window are not explicit enough to enable users to understand how they can refuse cookies.

When inspecting the amazon.fr website, the CNIL found that the information provided to users was neither clear nor complete. The cookie banner displayed on the site provided a general and approximate description of the purposes of the cookies (“to offer and improve our services”). Further, according to the CNIL, the “Read more” link included in the banner did not explain to users that they could refuse cookies, or how to do so. The CNIL found that Amazon Europe Core’s failure to provide adequate information was even more obvious in the case of users visiting the site after clicking on an ad published on another site: in that case, no information was provided to them at all.

Opt-out mechanism partially defective

In the case of Google, the CNIL also found that, when a user deactivated ad personalization on Google Search using the mechanism available from the “Access now” button, one of the advertising cookies remained stored on the user’s device and continued to read information destined for the server to which the cookie was attached. The CNIL concluded that the opt-out mechanism was partially defective.

CNIL’s Sanctions

In setting the fines in both cases, the CNIL took into account the seriousness of the breaches of Article 82 of the French Data Protection Act, the high number of users affected by those breaches, and the financial benefits deriving from the advertising income indirectly generated from the data collected by the advertising cookies. Interestingly, in the case of Google, the CNIL cited a decision of the French Competition Authority and referred to Google’s dominant position in the online search market.

In both cases, the CNIL noted that the companies amended their cookie practices in September 2020 and stopped automatically setting advertising cookies. However, the CNIL found that the new information provided to users is still not adequate. Accordingly, the CNIL ordered the three companies to provide adequate information to users about the use of cookies on their respective sites. The CNIL also ordered a periodic penalty payment of €100,000 (i.e., the maximum amount permitted under the French Data Protection Act) for each day of delay in complying with the injunction, commencing three months following notification of the CNIL’s decision in each case.

The CNIL addressed its decisions to the French establishment of the companies in order to enforce these decisions. The companies have four months to appeal the respective decision before France’s highest administrative court (Conseil d’Etat).

Read the CNIL’s decision against Google LLC and Google Ireland Limited and the CNIL’s decision against Amazon Europe Core (currently available only in French).

Copyright © 2020, Hunton Andrews Kurth LLP. All Rights Reserved.

 

ARTICLE BY Hunton Andrews Kurth’s Privacy and Cybersecurity practice

For more articles on Google, visit the National Law Review Communications, Media & Internet section.

New U.K. Competition Unit to Focus on Facebook and Google, and Protecting News Publishers

You know your company has tremendous market power when an agency is created just to watch you.

That’s practically what has happened in the U.K., where the Competition and Markets Authority (CMA) has increased oversight of ad-driven digital platforms, namely Facebook and Google, by establishing a dedicated Digital Markets Unit (DMU). While the unit was created to enforce new laws governing any platform that dominates its respective market, Facebook and Google will get its full attention when it begins operating in April 2021.

The CMA says the intention of the unit is to “give consumers more choice and control over their data, help small businesses thrive, and ensure news outlets are not forced out by their bigger rivals.” While acknowledging the “huge benefits” these platforms offer businesses and society, helping people stay in touch and share creative content, and helping companies advertise their services, the CMA noted the growing concern that the concentration of market power among so few companies is hurting growth in the tech sector, reducing innovation and “potentially” having negative effects on their individual and business customers.

The CMA said a new code and the DMU will help ensure that the platforms are not forcing unfair terms on businesses, specifically mentioning “news publishers” and the goal of “helping enhance the sustainability of high-quality online journalism and news publishing.”

The unit will have the power to suspend, block and reverse the companies’ decisions, order them to comply with the law, and fine them.

The devil will be in the details of what the new code will require, and questions remain about what specific conduct the DMU will target and what actions it will take. Will it require the companies to pay license fees to publishers for presenting previews of their content? Will the unit reduce the user data the companies may access, something that would threaten their ad revenue? Will Facebook and Google have to share data with competitors? We will learn more when the code is drafted and when the DMU begins work in April.

Once again a European nation has taken the lead on the global stage to control the downsides of technologies and platforms that have transformed how people communicate and get their news, and how companies reach them to promote their products. With the U.S. deadlocked on so many policy matters, change in the U.S. appears most likely to come as the result of litigation, such as the Department of Justice’s suit against Google, the FTC’s anticipated suit against Facebook, and private antitrust actions brought by companies and individuals.

Edited by Tom Hagy for MoginRubin LLP.

© MoginRubin LLP

ARTICLE BY MoginRubin LLP
For more articles on Google, visit the National Law Review Communications, Media & Internet section.

California Court of Appeal Rules that Challenge to Google’s Confidentiality Agreements May Proceed Past the Pleading Stage

On September 21, 2020, in a published 2-1 opinion in Doe v. Google Inc., the California Court of Appeal (Dist. 1, Div. 4) permitted three current and former Google employees to proceed with their challenge of Google’s confidentiality agreement as unlawfully overbroad and anti-competitive under the California Private Attorneys General Act (“PAGA”) (Lab. Code § 2698 et seq.).  In doing so, the Court of Appeal reversed the trial court’s order sustaining Google’s demurrer on the basis of preemption by the National Labor Relations Act (“NLRA”) (29 U.S.C. § 151 et seq.) under San Diego Bldg. Trades Council v. Garmon, 359 U.S. 236, 244–245 (1959).  The court held that while the plaintiffs’ claims relate to conduct arguably within the scope of the NLRA, they fall within the local interest exception to Garmon preemption and may therefore go forward.  It remains to be seen whether plaintiffs will be able to sustain their challenges to Google’s confidentiality policies on the merits.  However, Doe serves as a reminder to employers to carefully craft robust confidentiality agreements, particularly in the technology sector, in anticipation of potential challenges employees may make to those agreements.

Google requires its employees to sign various confidentiality policies.  The plaintiffs brought a lawsuit challenging these policies on the basis that they restricted their speech in violation of California law.  Specifically, the plaintiffs alleged 17 claims that fell into three subcategories based on Google’s confidentiality policies: restraints of competition, whistleblowing and freedom of speech.  The claims were brought under PAGA, a broad California law that provides a private right of action to “aggrieved employees” for any violation of the California Labor Code.  PAGA claims are brought on a representative basis—with the named plaintiffs deputized as private attorneys general—to recover penalties on behalf of all so-called “aggrieved employees,” typically state-wide, with 75% of such penalties being paid to the State and 25% to the “aggrieved employees” if the violation is proven (or a court-approved settlement is reached).

In their competition causes of action, plaintiffs alleged that Google’s confidentiality rules violated Business & Professions Code sections 17200, 16600, and 16700, as well as various Labor Code provisions, by preventing employees from using or disclosing the skills, knowledge, and experience they obtained at Google for purposes of competing with Google.  The court noted that section 16600 “evinces a settled legislative policy in favor of open competition and employee mobility” that has been “instrumental in the success of California’s technology industry.”  The plaintiffs complained that Google’s policies prevented them from negotiating a new job with another employer, disclosing who else works at Google, and disclosing under what circumstances an employee might be receptive to an offer from a rival employer.

With respect to their whistleblowing claims, the plaintiffs alleged that Google’s confidentiality rules prevent employees from disclosing violations of state and federal law, either within Google to their managers or outside Google to private attorneys or government officials, in violation of Business & Professions Code section 17200 et seq. and Labor Code section 1102.5.  The plaintiffs similarly alleged that the policies prevented employees from disclosing information about unsafe or discriminatory working conditions, a right afforded to them under the Labor Code.

In their freedom of speech claims, plaintiffs alleged that Google’s confidentiality rules prevent employees from engaging in lawful conduct during non-work hours and violate state statutes entitling employees to disclose wages, working conditions, and illegal conduct under various Labor Code provisions.  The employees argued this conduct could include writing a novel about working in Silicon Valley or even reassuring their parents that they are making enough money to pay their bills (i.e., matters seemingly untethered to a legitimate need for confidentiality).

While Google’s confidentiality rules contain a savings clause confirming the rules were not meant to prohibit protected activity, the plaintiffs argued that the clause was meaningless and was not honored in Google’s enforcement of its confidentiality agreements.

Google demurred to the entire complaint, and the trial court sustained the demurrer as to plaintiffs’ confidentiality claims, agreeing that the NLRA preempted such claims.

On appeal, the Court of Appeal recognized that the NLRA serves as a “comprehensive law governing labor relations” and that, accordingly, “the NLRB has exclusive jurisdiction over disputes involving unfair labor practices,” such that “state jurisdiction must yield” when state action would regulate conduct governed by the NLRA.  (Garmon, supra, 359 U.S. at pp. 244–245.)  But the court cautioned that NLRA preemption under Garmon cannot be applied in a “mechanical fashion,” and its application requires scrutiny into whether the activity in question is a “merely peripheral concern” of the NLRA or whether the “regulated conduct touche[s] interests so deeply rooted” in state and local interests.

In analyzing the federal and state issues at stake, the Court of Appeal found that several of the statutes undergirding plaintiffs’ PAGA claims did not sound in the principles of “mutual benefit” that are the foundation of the NLRA, but instead protected the plaintiffs’ activities as individuals.  The court cited several examples, including Labor Code section 232’s prohibition on employers preventing employees from disclosing the amount of their wages (a statute enacted to prevent sex discrimination) and Labor Code section 232.5’s prohibition on employers preventing employees from disclosing information about the employer’s working conditions (manifesting California’s policy against restrictions on speech regarding conditions of employment).  The court likewise found that the NLRA did not protect much of the activity prohibited by the statutes that supported plaintiffs’ PAGA claims, noting that the NLRA does not prohibit rules inhibiting employees from seeking new employment and competing with Google, as plaintiffs alleged Google’s confidentiality rules did.  Nor does the NLRA protect whistleblowing activity unconnected to working conditions, such as reporting violations of securities laws, false claims laws, and other laws unrelated to terms and conditions of employment.

Nevertheless, the court held that, regardless of the diverging purposes of the NLRA and the laws that support the plaintiffs’ PAGA claims, plaintiffs’ claims fall squarely within the local interest exception to NLRA preemption.  Where an employer’s policies are arguably prohibited by the NLRA, the local interest exception applies when (1) there is a “significant state interest” in protecting the citizen from the challenged conduct, and (2) the exercise of state jurisdiction entails “little risk of interference” with the NLRB’s regulatory function.  The court found no difficulty in determining that an action under PAGA, in which the plaintiffs serve as a “proxy or agent of the state’s labor law enforcement agencies,” grows from “deeply-rooted local interests” in regulating wages, hours, and other terms of employment.  It also found that a state’s enforcement of its minimum employment standards, particularly in relation to the plaintiffs’ claims in this case, was peripheral to the NLRA’s purpose of safeguarding, first and foremost, workers’ rights to join unions and engage in collective bargaining.  Thus, the court held, there was no basis for NLRA preemption in this case.

Particularly in light of this opinion, employers who require employees to execute confidentiality agreements should be cognizant of the myriad ways those agreements can be challenged.  As Doe v. Google, Inc. illustrates, such challenges may come not just from individuals bringing claims in their own capacity, but from employees acting as private attorneys general bringing representative claims on behalf of all California employees.  Nor can NLRA preemption be mechanically applied to preempt claims based upon such agreements.  Employers would be well-advised to review their existing confidentiality agreements and consult experienced counsel before revising or rolling out such agreements.


Copyright © 2020, Sheppard Mullin Richter & Hampton LLP.
For more articles on labor law, visit the National Law Review Labor & Employment section.

YouTube May Be an Enormous Town Square, But It’s Still Not Subject to the First Amendment

In Prager University v. Google LLC, et al., Case No. 18-15712 (9th Cir. Feb. 26, 2020), the Court of Appeals for the Ninth Circuit dismissed a First Amendment lawsuit against YouTube late last week, holding that the video hosting giant is a private forum that is free to foster particular viewpoints – and need not be content-neutral.  The victory is a significant message to other online content hosts, aggregators and service providers that they need not feel threatened by censorship claims for selecting and curating content on their systems.

The lawsuit began in 2017, when conservative media company PragerU sued YouTube for imposing restrictions on some of PragerU’s short animated educational videos.  YouTube tagged several dozen videos for age-restrictions and disabled third party advertisements on others.  PragerU claimed the restrictions constituted censorship because they muted conservative political viewpoints.

Traditionally, the First Amendment regulates only U.S. and state government actors when it comes to censoring speech; it does not touch the actions of private parties.  The Ninth Circuit noted that these principles have not “lost their vitality in the digital age.”  While this threshold question is not new, PragerU’s approach to this legal hurdle has drawn fresh interest in how courts’ conception of state action might one day shift in order to accommodate the digital re-imagining of a marketplace of ideas.

PragerU argued that YouTube should be treated as something akin to a government actor where it operates a “public forum for speech.”  The theory goes that because YouTube has an overwhelming share of the video sharing and streaming space, it essentially performs a “public function.”  The Ninth Circuit affirmed that public use of private resources, even on a large scale, is simply not governmental.  That YouTube generally invites the public to use its private property (in this case, its platform) for a specific or designated purpose does not mean the property loses its private character.  Similarly, the Ninth Circuit ruled almost twenty years ago that internet service provider America Online was not a government actor even though it broadly opened its networks to the public to send and receive speech.

PragerU’s theory does enjoy some support.  As the Ninth Circuit acknowledged, a private actor is a state or government entity for First Amendment purposes when it performs a public function that is “traditionally and exclusively governmental.”  In other words, the First Amendment may well still apply to private companies tasked with operating public elections or even local governmental administrative duties (for example, the proverbial “company town”).  But the Ninth Circuit simply did not accept the argument that YouTube’s function of “hosting speech on a private platform” bore any resemblance to “an activity that only governmental entities” traditionally and exclusively perform.  After all, noted the Court, even “grocery stores and comedy clubs have opened their property for speech.”  Neither was the Court persuaded that the sheer scale of YouTube’s operation – equal to perhaps many millions of grocery stores and comedy clubs – should alter the analysis.

Had the Ninth Circuit adopted PragerU’s approach, it would have been the first major judicial endorsement of the view that a private entity can convert into a public one solely where its property is opened up to significant public discourse.  Instead, the Ninth Circuit imposed and upheld a more traditional delineation between public and private actors in First Amendment jurisprudence.


© 2020 Mitchell Silberberg & Knupp LLP

See the National Law Review for more on constitutional law questions.

Hackers Eavesdrop and Obtain Sensitive Data of Users Through Home Smart Assistants

Although Amazon and Google respond to reports of vulnerabilities in their popular home smart assistants Alexa and Google Home, hackers continually work to exploit those vulnerabilities in order to listen to users’ every word and obtain sensitive information that can be used in future attacks.

Last week, ZDNet reported that two security researchers at Security Research Labs (SRLabs) discovered phishing and eavesdropping vectors that abuse the technology Amazon and Google provide to app developers for the Alexa and Google Home products, which “provide[s] access to functions that developers can use to customize the commands to which a smart assistant responds, and the way the assistant replies.”

By inserting certain commands into the back end of a normal Alexa/Google Home app, an attacker can silence the assistant for long periods even though it remains active. After the silence, the attacker sends the user a phishing message that appears unconnected to the app the user interacted with: a fake message, made to look like it comes from Amazon or Google, asking for the user’s Amazon/Google account password. Once the hacker has access to the home assistant, the hacker can eavesdrop on the user, keep the listening device active, and record the user’s conversations. Obviously, when attackers can listen to every word, even when the device appears to be turned off, they can obtain highly personal information that can be used malevolently in the future.

The manufacturers of home smart assistants reiterate to users that the devices will never ask for their account passwords. Cyber hygiene for home assistants is no different from cyber hygiene for email.


Copyright © 2019 Robinson & Cole LLP. All rights reserved.

For more hacking risk mitigation, see the National Law Review Communications, Media & Internet law page.

Can We Really Forget?

I expected this post would turn out differently.

I had intended to commend the European Court of Justice for placing sensible limits on the extraterritorial enforcement of the EU’s Right to be Forgotten. They did, albeit in a limited way,[1] and it was a good decision. There.  I did it. In 154 words.

Now for the remaining 1400 or so words.

But reading the decision pushes me back into frustration at the entire Right to be Forgotten regime and its illogical and destructive basis. The fact that a court recognizes the clear fact that the EU cannot (generally) force foreign companies to violate the laws of their own countries on internet sites that are intended for use within those countries (and NOT the EU), does not come close to offsetting the logical, practical and societal problems with the way the EU perceives and enforces the Right to be Forgotten.

As a lawyer, with all decisions grounded in the U.S. Constitution, I am comfortable with the First Amendment’s protection of Freedom of Speech – that nearly any truthful utterance or publication is inviolate, and that the foundation of our political and social system depends on open exposure of facts to sunlight. Intentionally shoving those true facts into the dark is wrong in our system and openness will be protected by U.S. courts.

Believe it or not, the European Union has such a concept at the core of its foundation too. Article 10 of the European Convention on Human Rights states that:

“Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.”

So we have the same values, right? In both jurisdictions the right to impart information can be exercised without interference by public authority.  Not so fast.  EU law contains a litany of restrictions on this right, including limits on free speech imposed to protect the reputation of others.

This seems like a complete evisceration of a right to open communication if a court can force obfuscation of facts just to protect someone’s reputation.  Does this person deserve a bad reputation? Has he or she committed a crime, failed to pay his or her debts, harmed animals or children, stalked an ex-lover, or violated an oath of office, marriage, priesthood or citizenship? It doesn’t much matter in the EU. The right of that person to hide his/her bad or dangerous behavior outweighs both the allegedly fundamental right to freedom to impart true information AND the public’s right to protect itself from someone who has proven himself/herself to be a risk to the community.

So how does this tension play out over the internet? In the EU, it is law that Google and other search engines must remove links to true facts about any wrongdoer who feels his or her reputation may be tarnished by the discovery of the truth about that person’s behavior. Get into a bar fight?  Don’t worry, the EU will put the entire force of law behind your request to wipe that off your record. Stiff your painting contractors for tens of thousands of euros despite their good performance? Don’t worry, the EU will make sure nobody can find out. Get fired, removed from office or defrocked for dishonesty? Don’t worry, the EU has your back.

And that undercutting of speech rights has now been codified in Article 17 of Regulation 2016/679, the Right to be Forgotten.

And how does this new decision affect the rule? In the past couple weeks, the Grand Chamber of the EU Court of Justice issued an opinion limiting the extraterritorial reach of the Right to be Forgotten. (Google vs CNIL, Case C‑507/17) The decision confirms that search engines must remove links to certain embarrassing instances of true reporting, but must only do so on the versions of the search engine that are intentionally servicing the EU, and not necessarily in versions of the search engines for non-EU jurisdictions.

The problems with appointing Google to be an extrajudicial magistrate enforcing vague EU-granted rights under a highly ambiguous set of standards and then fining them when you don’t like a decision you forced them to make, deserve a separate post.

Why did we even need this decision? Because the French data privacy protection agency, known as the CNIL, fined Google for not removing presumably true data from non-EU search results concerning, as Reuters described, “a satirical photomontage of a female politician, an article referring to someone as a public relations officer of the Church of Scientology, the placing under investigation of a male politician and the conviction of someone for sexual assaults against minors.”  So, to be clear, while the official French agency believes it should enforce a right for people to obscure from the whole world that they have been convicted of sexual assault against children, the Grand Chamber of the European Court of Justice believes that people convicted of child sexual assault should be protected in their right to obscure these facts only from people in Europe. This is progress.

Of course, in the U.S., politicians and other public figures, under investigation or subject to satire or people convicted of sexual assault against children do not have a right to protect their reputations by forcing Google to remove links to public records or stories in news outlets. We believe both that society is better when facts are allowed to be reported and disseminated and that society is protected by reporting on formal allegations against public figures or criminal convictions of private ones.

I am glad that the EU Court of Justice is willing to restrict rules to remain within its jurisdiction where they openly conflict with the basic laws of other jurisdictions. The Court sensibly held,

“The idea of worldwide de-referencing may seem appealing on the ground that it is radical, clear, simple and effective. Nonetheless, I do not find that solution convincing, because it takes into account only one side of the coin, namely the protection of a private person’s data.[2] . . . [T]he operator of a search engine is not required, when granting a request for de-referencing, to operate that de-referencing on all the domain names of its search engine in such a way that the links at issue no longer appear, regardless of the place from which the search on the basis of the requester’s name is carried out.”

Any other decision would be wildly overreaching. Believe me, every country in the EU would be howling in protest if the US decided that its views of personal privacy must be enforced in Europe by European companies due to operations aimed only to affect Europe. It should work both ways. So this was a well-reasoned limitation.

But I just cannot bring myself to be complimentary of a regime that I find so repugnant – where nearly any bad action can be swept under the rug in the name of protecting a person’s reputation.

As I have written in books and articles in the past, government protection of personal privacy is crucial for the clean and correct operation of a democracy.  However, privacy is also the obvious refuge of scoundrels – people prefer to keep the bad things they do private. Who wouldn’t? But one can go overboard protecting this right, and it feels like the EU has institutionalized its leap overboard.

I would rather err on the side of sunshine, giving up some privacy in the service of revealing the truth, than err on the side of darkness, allowing bad deeds to be obscured so that those who commit them can maintain their reputations.  Clearly, the EU doesn’t agree with me.


[1] The Court, in this case, wrote, “The issues at stake therefore do not require that the provisions of Directive 95/46 be applied outside the territory of the European Union. That does not mean, however, that EU law can never require a search engine such as Google to take action at worldwide level. I do not exclude the possibility that there may be situations in which the interest of the European Union requires the application of the provisions of Directive 95/46 beyond the territory of the European Union; but in a situation such as that of the present case, there is no reason to apply the provisions of Directive 95/46 in such a way.”

[2] EU Court of Justice case C-136/17, which states, “While the data subject’s rights [to privacy] override, as a general rule, the freedom of information of internet users, that balance may, however, depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information. . . .”

 


Copyright © 2019 Womble Bond Dickinson (US) LLP All Rights Reserved.

For more EU’s GDPR enforcement, see the National Law Review Communications, Media & Internet law page.

Not So Fast And Furious – Executive Indicted for Stealing Self-Driving Car Trade Secrets

Back in March 2017, we posted about a civil lawsuit against Anthony Levandowski, who allegedly sped off with a trove of trade secrets after resigning from Waymo LLC, Google’s self-driving technology company. Waymo not only sued Levandowski, but also his new employer, Uber, and another co-conspirator, Lior Ron. Since our initial post, things have gotten progressively worse for the Not So Fast and Furious trio: (1) Levandowski was fired in May 2017; (2) Uber settled, giving up 5% of its stock, which totaled $245 million; and (3) the case against Levandowski and Ron was sent to arbitration, where the arbitration panel reportedly issued a $128 million interim award to Waymo.

Just when things couldn’t seem to get any worse, they did.

On August 15, 2019, a federal grand jury indicted Levandowski on 33 counts relating to trade secret theft. Levandowski has pled not guilty, has been released on $2 million bail, and is currently wearing an ankle monitor.

This legal saga is a reminder that trade secret theft is serious: it has not only civil consequences, but criminal ones as well.  Unfortunately, trade secret theft happens every day.  And regardless of whether your company has trade secrets regarding self-driving car technology worth hundreds of millions of dollars, or customer information worth less than a hundred thousand dollars, it’s important to make sure your company’s information is protected.

Equally important is knowing how to investigate potential trade secret theft. Some helpful tips as you launch your investigation:

1. Secure and preserve all relevant computing devices and email/file-sharing accounts.

2. Consider enlisting the help of outside computer forensic experts.

3. Analyze the employee’s computing activities on company computers and accounts.

4. Determine whether there is any abnormal file access, including during non-business hours (a minimal review sketch follows this list).

5. Examine the employee’s use of external storage devices and whether those devices have been returned.

6. Review text message and call history from the employee’s company issued cell phone (and never instruct anyone to factory reset cell phones).

7. Enlist the help of outside counsel to set the parameters of the investigation.
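
As a rough illustration of tip 4, the sketch below flags file-access events that fall outside business hours. It assumes a hypothetical CSV export of access events with "user", "path", and ISO-8601 "timestamp" columns; the column names and the business-hours window are placeholders to adapt to your environment, not features of any particular forensic tool.

```python
# A minimal sketch for tip 4: flag file access outside business hours.
# Assumes a hypothetical CSV export with "user", "path", and ISO-8601
# "timestamp" columns; adjust names and the hours window as needed.
import csv
from datetime import datetime

BUSINESS_HOURS = range(8, 18)  # assumed window: 8:00-17:59, Mon-Fri

def after_hours_events(csv_path):
    """Return access events occurring outside business hours or on weekends."""
    flagged = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            if ts.hour not in BUSINESS_HOURS or ts.weekday() >= 5:
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    for event in after_hours_events("file_access_log.csv"):
        print(f'{event["timestamp"]}  {event["user"]}  {event["path"]}')
```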


© 2019 Jones Walker LLP
For more on trade secret law, see the National Law Review Intellectual Property law page.