Comparing and Contrasting the State Laws: Does Pseudonymized Data Exempt Organizations from Complying with Privacy Rights?

Some organizations are confused as to the impact that pseudonymization has (or does not have) on a privacy compliance program. That confusion largely stems from ambiguity concerning how the term fits into the larger scheme of modern data privacy statutes. For example, aside from the definition, the CCPA only refers to “pseudonymized” on one occasion – within the definition of “research” the CCPA implies that personal information collected by a business should be “pseudonymized and deidentified” or “deidentified and in the aggregate.”[1] The conjunctive reference to research being both pseudonymized “and” deidentified raises the question whether the CCPA lends any independent meaning to the term “pseudonymized.” Specifically, the CCPA assigns a higher threshold of anonymization to the term “deidentified.” As a result, if data is already deidentified it is not clear what additional processing or set of operations is expected to pseudonymize the data. The net result is that while the CCPA introduced the term “pseudonymization” into the American legal lexicon, it did not give it any significant legal effect or status.
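To make the distinction more concrete, the sketch below contrasts the two operations as they are commonly understood in practice: pseudonymization replaces a direct identifier with a key-protected token that the key holder can still link back to the individual, while deidentification removes or coarsens identifying detail so the record can no longer reasonably be tied to a person. The record, field names, and key handling are invented for illustration; this is a simplified technical sketch, not a statement of any statute's requirements.

```python
import hashlib
import hmac

# Hypothetical record; the field names are illustrative only.
record = {"email": "jane.doe@example.com", "zip": "84101", "purchase_total": 42.50}

# In a real program the key would be stored separately, under strict access controls.
SECRET_KEY = b"held-separately-under-organisational-controls"

def pseudonymize(rec: dict) -> dict:
    """Replace the direct identifier with a keyed token.

    Whoever holds the key can re-link the token to the individual,
    which is why pseudonymized data generally remains personal data.
    """
    token = hmac.new(SECRET_KEY, rec["email"].encode(), hashlib.sha256).hexdigest()
    out = dict(rec)
    out["email"] = token
    return out

def deidentify(rec: dict) -> dict:
    """Strip or coarsen identifiers so the record can no longer reasonably
    be linked to a particular consumer -- a higher bar than pseudonymization."""
    out = dict(rec)
    del out["email"]                      # drop the direct identifier entirely
    out["zip"] = out["zip"][:3] + "**"    # coarsen a quasi-identifier
    return out

print(pseudonymize(record))
print(deidentify(record))
```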

Unlike the CCPA, the data privacy statutes of Virginia, Colorado, and Utah do give pseudonymization a direct impact on compliance obligations. As the chart below indicates, those statutes do not require organizations to apply access or deletion rights to pseudonymized data, but they do imply that other rights (e.g., the right to opt out of sale) still apply to such data. Ambiguity remains as to how those non-exempted rights operate in practice. For example, while Virginia does not require an organization to re-identify pseudonymized data, it is unclear how an organization could opt a consumer out of having their pseudonymized data sold without re-identification.


ENDNOTES

[1] Cal. Civ. Code § 1798.140(ab)(2) (West 2021). It should be noted that the reference to pseudonymizing and deidentifying personal information is found within the definition of the word “Research”; as such, it is unclear whether the CCPA was attempting to indicate that personal information will not be considered research unless it has been pseudonymized and deidentified, or whether the CCPA is mandating that companies that conduct research must pseudonymize and deidentify. Given that the reference is found within the definition section of the CCPA, the former interpretation seems the most likely intent of the legislature.

[2] The GDPR does not expressly define the term “sale,” nor does it ascribe particular obligations to companies that sell personal information. Selling, however, is implicitly governed by the GDPR as any transfer of personal information from one controller to a second controller would be considered a processing activity for which a lawful purpose would be required pursuant to GDPR Article 6.

[3] Va. Code 59.1-577(B) (2022).

[4] Utah Code Ann. 13-61-303(1)(a) (2022).

[5] Va. Code 59.1-577(D) (2022) (exempting compliance with Va. Code 59.1-573(A)(1) through (4)).

[6] C.R.S. 6-1-1307(3) (2022) (exempting compliance with C.R.S. Section 6-1-1306(1)(b) to (1)(e)).

[7] Utah Code Ann. 13-61-303(1)(c) (2022) (exempting compliance with Utah Code Ann. 13-61-202(1) through (3)).

[8] Va. Code 59.1-577(D) (2022) (exempting compliance with Va. Code 59.1-573(A)(1) through (4)).

[9] C.R.S. 6-1-1307(3) (2022) (exempting compliance with C.R.S. Section 6-1-1306(1)(b) to (1)(e)).

[10] Va. Code 59.1-577(D) (2022) (exempting compliance with Va. Code 59.1-573(A)(1) through (4)).

[11] C.R.S. 6-1-1307(3) (2022) (exempting compliance with C.R.S. Section 6-1-1306(1)(b) to (1)(e)).

[12] Utah Code Ann. 13-61-303(1)(c) (2022) (exempting compliance with Utah Code Ann. 13-61-202(1) through (3)).

[13] Va. Code 59.1-577(D) (2022) (exempting compliance with Va. Code 59.1-574).

[14] Va. Code 59.1-577(D) (2022) (exempting compliance with Va. Code 59.1-574).

©2022 Greenberg Traurig, LLP. All rights reserved.

Navigating the Data Privacy Landscape for Autonomous and Connected Vehicles: Best Practices

Autonomous and connected vehicles, and the data they collect, process and store, create high demands for strong data privacy and security policies. Accordingly, in-house counsel must define holistic data privacy best practices for consumer and B2B autonomous vehicles that balance compliance, safety, consumer protections and opportunities for commercial success against a patchwork of federal and state regulations.

Understanding key best practices related to the collection, use, storage and disposal of data will help in-house counsel frame balanced data privacy policies for autonomous vehicles and consumers. This is the inaugural article in our series on privacy policy best practices related to:

  1. Data collection

  2. Data privacy

  3. Data security

  4. Monetizing data

Autonomous and Connected Vehicles: Data Protection and Privacy Issues

The spirit of America is tightly intertwined with the concept of personal liberty, including freedom to jump in a car and go… wherever the road takes you. As the famous song claims, you can “get your kicks on Route 66.” But today you don’t just get your kicks. You also get terabytes of data on where you went, when you left and arrived, how fast you traveled to get there, and more.

Today’s connected and semi-autonomous vehicles are actively collecting 100x more data than a personal smartphone, precipitating a revolution that will drive changes not just to automotive manufacturing, but to our culture, economy, infrastructure, and legal and regulatory landscapes.

As our cars are becoming computers, the volume and specificity of data collected continues to grow. The future is now. Or at least, very near. Global management consultancy McKinsey estimates that “full autonomy with Level 5 technology—operating anytime, anywhere” could arrive as soon as the next decade.

This near-term future isn’t only for consumer automobiles and ride-sharing robo taxis. B2B industries, including logistics and delivery, agriculture, mining, waste management and more are pursuing connected and autonomous vehicle deployments.

In-house counsel must balance evolving regulations at the federal and state level, as well as consider cross-border and international regulations for global technologies. In the United States, the Federal Trade Commission (FTC) is the primary regulatory agency governing data privacy, alongside individual states that are developing their own regulations, with the California Consumer Privacy Act (CCPA) leading the way. Virginia’s and Colorado’s new laws, as well as the California Privacy Rights Act, come into effect in 2023, and a half dozen more states are expected to enact new privacy legislation in the near future.

While federal and state regulations continue to evolve, mobility companies in the consumer and B2B mobility sectors need to make decisions today about their own data privacy and security policies in order to optimize compliance and consumer protection with opportunities for commercial success.

Understanding Types of Connected and Autonomous Vehicles

Autonomous, semi-autonomous, self-driving, connected and networked cars: in this developing category, these descriptions are often used interchangeably in leading business and industry publications. B2B International defines “connected vehicles (CVs) [as those that] use the latest technology to communicate with each other and the world around them” whereas “autonomous vehicles (AVs)… are capable of recognizing their environment via the use of on-board sensors and global positioning systems in order to navigate with little or no human input. Examples of autonomous vehicle technology already in action in many modern cars include self-parking and auto-collision avoidance systems.”

But SAE International and the National Highway Traffic Safety Administration (NHTSA) go further, defining six levels of driving automation (Levels 0 through 5) in self-driving cars.

Levels of Driving Automation™ in Self-Driving Cars

Level 3 and above autonomous driving is getting closer to reality every day because of an array of technologies, including: sensors, radar, sonar, lidar, biometrics, artificial intelligence and advanced computing power.

Approaching a Data Privacy Policy for Connected and Autonomous Vehicles

Because the mobility tech ecosystem is so dynamic, many companies, though well intentioned, inadvertently start with insufficient data privacy and security policies for their autonomous vehicle technology. The focus for these early and second stage companies is on bringing a product to market and, when sales accelerate, there is an urgent need to ensure their data privacy policies are comprehensive and compliant.

Whether companies are drafting initial policies or revising existing ones, there are general data principles that can guide policy development across the lifecycle of data:

  • Collect: only collect the data you need

  • Use: only use data for the reason you informed the consumer

  • Store: ensure reasonable data security protections are in place

  • Dispose: dispose of the data when it’s no longer needed

Additionally, for many companies, framing autonomous and connected vehicle data protection and privacy issues through a safety lens can help determine the optimal approach to constructing policies that support the goals of the business while satisfying federal and state regulations.

For example, a company that monitors driver alertness (critical for safety in today’s Level 2 AV environment) through biometrics is, by design, collecting data on each driver who uses the car. This scenario clearly supports vehicle and driver safety while at the same time implicating U.S. data privacy law.

In the emerging regulatory landscape, in-house counsel will continue to be challenged to balance safety and privacy. Biometrics will become even more prevalent in connection to identification and authentication, along with other driver-monitoring technologies for all connected and autonomous vehicles, but particularly in relation to commercial fleet deployments.

Developing Best Practices for Data Privacy Policies

In-house counsel at autonomous vehicle companies are responsible for constructing their company’s data privacy and security policies. Best practices should be set around:

  • What data to collect and when

  • How collected data will be used

  • How to store collected data securely

  • Data ownership and monetization

Today, the CCPA sets the standard for rigorous consumer protections related to data ownership and privacy. However, in this evolving space, counsel will need to monitor and adjust their company’s practices and policies to comply with new regulations as they continue to develop in the U.S. and countries around the world.

Keeping best practices related to the collection, use, storage and disposal of data in mind will help in-house counsel construct policies that balance consumer protections with safety and the commercial goals of their organizations.

A parting consideration may be opportunistic, if extralegal: companies that choose to advocate strongly for customer protections may be afforded a powerful, positive opportunity to position themselves as responsible corporate citizens.

© 2022 Varnum LLP

The DOJ Throws Cold Water on the Frosties NFT Founders

The U.S. Attorney’s Office for the Southern District of New York recently charged two individuals for allegedly participating in a scheme to defraud purchasers of “Frosties” non-fungible tokens (or “NFTs”) out of over $1 million. The two-count complaint charges Ethan Nguyen (aka “Frostie”) and Andre Llacuna (aka “heyandre”) with conspiracy to commit wire fraud in violation of 18 U.S.C. § 1349 and conspiracy to commit money laundering in violation of 18 U.S.C. § 1956.   Each charge carries a maximum sentence of 20 years in prison.

The Defendants marketed “Frosties” as the entry point to a broader online community consisting of games, reward programs, and other benefits.  In January 2022, their “Frosties” pre-sale raised approximately $1.1 million.

In a so-called “rug pull,” Frostie and heyandre transferred the funds raised through the pre-sale to a series of separate cryptocurrency wallets, eliminated Frosties’ online presence, and took down its website.  The transaction, which was publicly recorded and viewable on the blockchain, triggered investors to sell Frosties at a considerable discount.  Frostie and heyandre then allegedly proceeded to move the funds through a series of transactions intended to obfuscate the source and increase anonymity.  The charges came as the Defendants were preparing for the March 26 pre-sale of their next NFT project, “Embers,” which law enforcement alleges would likely have followed the same course as “Frosties.”

In a public statement announcing the arrests, the DOJ explained how the emerging NFT market is a risk-laden environment that has attracted the attention of scam artists.  Representatives from each of the federal agencies that participated in the investigation cautioned the public and put other potential fraudsters on notice of the government’s watchful eye towards cryptocurrency malfeasance.

This investigation comes on the heels of the FBI’s announcement last month of the Virtual Asset Exploitation Unit, a special task force dedicated to blockchain analysis and virtual asset seizure. The prosecution of the Defendants in this matter continues aggressive efforts by federal agencies to rein in bad actors participating in the cryptocurrency/digital assets/blockchain space.

Copyright ©2022 Nelson Mullins Riley & Scarborough LLP

WW International to Pay $1.5 Million Civil Penalty for Alleged COPPA Violations

In 2014, with childhood obesity on the rise in the United States, tech company Kurbo, Ltd. (Kurbo) marketed a free app for kids that, according to the company, was “designed to help kids and teens ages 8-17 reach a healthier weight.” When WW International (WW) (formerly Weight Watchers) acquired Kurbo in 2018, the app was rebranded “Kurbo by WW,” and WW continued to market the app to children as young as eight. But according to the Federal Trade Commission (FTC), Kurbo’s privacy practices were not exactly child-friendly, even if its app was. The FTC’s complaint, filed by the Department of Justice (DOJ) last month, claims that WW’s notice, data collection, and data retention practices violated the Children’s Online Privacy Protection Act Rule (COPPA Rule). WW and Kurbo, under a stipulated order, agreed to pay a $1.5 million civil penalty in addition to complying with a range of injunctive provisions. These provisions include, but are not limited to, deleting all personal information of children whose parents did not provide verifiable parental consent in a specified timeframe, and deleting “Affected Work Product” (defined in the order to include any models or algorithms developed in whole or in part using children’s personal information collected through the Kurbo Program).

Complaint Background

The COPPA Rule applies to any operator of a commercial website or online service directed to children that collects, uses, and/or discloses personal information from children and to any operator of a commercial website or online service that has actual knowledge that it collects, uses, and/or discloses personal information from children. Operators must notify parents and obtain their consent before collecting, using, or disclosing personal information from children under 13.

The complaint states that children enrolled in the Kurbo app by signing up through the app or having a parent do it on their behalf. Once on Kurbo, users could enter personal information such as height, weight, and age, and the app then tracked their weight, food consumption, and exercise. However, the FTC alleges that Kurbo’s age gate was porous, requiring no verification process to establish that children who affirmed they were over 13 were the age they claimed to be or that users asserting they were parents were indeed parents. In fact, the complaint alleges that the registration area featured a “tip-off” screen that gave visitors just two choices for registration: the “I’m a parent” option or the “I’m at least 13” option. Visitors saw the legend, “Per U.S. law, a child under 13 must sign up through a parent” on the registration page featuring these choices. Indeed, thousands of users who indicated that they were at least 13 were actually younger and were later able to change their profile information to their real ages. Users who lied about their age or who falsely claimed to be parents were able to continue to use the app. In 2020, after a warning from the FTC, Kurbo implemented a registration screen that removed the legend and the “at least 13” option. However, the new process failed to provide verification measures to establish that users claiming to be parents were indeed parents.

Kurbo’s notice of data collection and data retention practices also fell short. The COPPA Rule requires an operator to “post a prominent and clearly labeled link to an online notice of its information practices with regard to children on the home or landing page or screen of its Web site or online service, and, at each area of the Web site or online service where personal information is collected from children.” But beginning in November 2019, Kurbo’s notice at registration was buried in a list of hyperlinks that parents were not required to click through, and the notice failed to list all the categories of information the app collected from children. Further, Kurbo did not comply with the COPPA Rule’s mandate to keep children’s personal information only as long as reasonably necessary for the purpose it was collected and then to delete it. Instead, the company held on to personal information indefinitely unless parents specifically requested its removal.

Stipulated Order

In addition to imposing a $1.5 million civil penalty, the order, which was approved by the court on March 3, 2022, requires WW and Kurbo to:

  • Refrain from disclosing, using, or benefitting from children’s personal information collected in violation of the COPPA Rule;
  • Delete all personal information Kurbo collected in violation of the COPPA Rule within 30 days;
  • Provide a written statement to the FTC that details Kurbo’s process for providing notice and seeking verifiable parental consent;
  • Destroy all affected work product derived from improperly collecting children’s personal information and confirm to the FTC that deletion has been carried out;
  • Delete all children’s personal information collected within one year of the user’s last activity on the app; and
  • Create and follow a retention schedule that states the purpose for which children’s personal information is collected, the specific business need for retaining such information, and criteria for deletion, including a set timeframe no longer than one year.

Implications of the Order

Following the U.S. Supreme Court’s decision in AMG Capital Management, LLC v. Federal Trade Commission, which halted the FTC’s ability to use its Section 13(b) authority to seek equitable monetary relief for violations of the FTC Act, the FTC has been pushing Congress to grant it greater enforcement powers. In the meantime, the FTC has used other enforcement tools, including the recent resurrection of the agency’s long-dormant Penalty Offense Authority under Section 5(m)(1)(B) of the FTC Act and a renewed willingness to use algorithmic disgorgement (which the FTC first applied in the 2019 Cambridge Analytica case).

Algorithmic disgorgement involves “requir[ing] violators to disgorge not only the ill-gotten data, but also the benefits—here, the algorithms—generated from that data,” as then-Acting FTC Chair Rebecca Kelly Slaughter stated in a speech last year. This order appears to be the first time algorithmic disgorgement was applied by the Commission in an enforcement action under COPPA.

Children’s privacy issues continue to attract the attention of the FTC and lawmakers at both federal and state levels. Companies that collect children’s personal information should be careful to ensure that their privacy policies and practices fully conform to the COPPA Rule.

© 2022 Keller and Heckman LLP

New UK IDTA and Addendum Come Into Force

The new UK International Data Transfer Agreement (“IDTA”) and Addendum to the new 2021 EU Standard Contractual Clauses (“New EU SCCs”) are now in force (as of 21 March 2022), providing much-needed certainty for UK organisations transferring personal data to service providers and group companies based outside of the UK/EEA.

The IDTA and Addendum replace the old EU Standard Contractual Clauses (“Old EU SCCs”) as a UK GDPR-compliant transfer tool for restricted transfers from the UK, and they also enable UK data exporters to comply with the European Court of Justice’s ‘Schrems II’ judgement.

For new UK data transfer arrangements, or where UK organisations are in the process of reviewing their existing arrangements, using the new IDTA or Addendum is the best option to future-proof against the need to replace them in two years’ time.

Where data flows involve transfers of personal data from both the UK and the EU, the use of the Addendum alongside the New EU SCCs will enable organisations to implement a more harmonised solution.



Article By Francesca Fellowes of Squire Patton Boggs (US) LLP. Hannah-Mei Grisley also contributed to this article.

© Copyright 2022 Squire Patton Boggs (US) LLP

Utah Becomes Fourth U.S. State to Enact Consumer Privacy Law

On March 24, 2022, Utah became the fourth state in the U.S., following California, Virginia and Colorado, to enact a consumer data privacy law, the Utah Consumer Privacy Act (the “UCPA”). The UCPA resembles Virginia’s Consumer Data Protection Act (“VCDPA”) and Colorado’s Consumer Privacy Act (“CPA”), and, to a lesser extent, the California Consumer Privacy Act (as amended by the California Privacy Rights Act) (“CCPA/CPRA”). The UCPA will take effect on December 31, 2023.

The UCPA applies to a controller or processor that (1) conducts business in Utah or produces a product or service targeted to Utah residents; (2) has annual revenue of $25,000,000 or more; and (3) satisfies at least one of the following thresholds: (a) during a calendar year, controls or processes the personal data of 100,000 or more Utah residents, or (b) derives over 50% of its gross revenue from the sale of personal data, and controls or processes the personal data of 25,000 or more consumers.
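Read as a rule, those thresholds form a conjunction with a nested disjunction: the Utah nexus and the revenue floor must both be satisfied, plus at least one of the two volume-based prongs. The sketch below expresses that structure in Python; the parameter names and the example business are invented for illustration, and the function is an orientation aid, not a substitute for an applicability analysis.

```python
def ucpa_applies(
    utah_nexus: bool,                 # conducts business in Utah or targets Utah residents
    annual_revenue_usd: float,
    utah_consumers_processed: int,
    pct_revenue_from_data_sales: float,
) -> bool:
    """Illustrative reading of the UCPA thresholds: (1) AND (2) AND ((3)(a) OR (3)(b))."""
    prong_a = utah_consumers_processed >= 100_000
    prong_b = (
        pct_revenue_from_data_sales > 50.0
        and utah_consumers_processed >= 25_000
    )
    return utah_nexus and annual_revenue_usd >= 25_000_000 and (prong_a or prong_b)

# Hypothetical business: $30M revenue, 40,000 Utah consumers, 60% of revenue from data sales.
print(ucpa_applies(True, 30_000_000, 40_000, 60.0))  # True, via prong (b)
```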

As with the CPA and VCDPA, the UCPA’s protections apply only to Utah residents acting solely within their individual or household context, with an express exemption for individuals acting in an employment or commercial (B2B) context. Similar to the CPA and VCDPA, the UCPA contains exemptions for covered entities, business associates and protected health information subject to the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”), and financial institutions or personal data subject to the Gramm-Leach-Bliley Act (“GLB”). As with the CCPA/CPRA and VCDPA, the UCPA also exempts from its application non-profit entities.

In line with the CCPA/CPRA, CPA and VCDPA, the UCPA provides Utah consumers with certain rights, including the right to access their personal data, delete their personal data, obtain a copy of their personal data in a portable manner, opt out of the “sale” of their personal data, and opt out of “targeted advertising” (as each term is defined under the law). Notably, the UCPA adopts the VCDPA’s narrower definition of “sale,” which is limited to the exchange of personal data for monetary consideration by a controller to a third party. Unlike the CCPA/CPRA, CPA and VCDPA, the UCPA will not provide Utah consumers with the ability to correct inaccuracies in their personal data. Also unlike the CPA and VCDPA, the UCPA will not require controllers to obtain prior opt-in consent to process “sensitive data” (i.e., racial or ethnic origin, religious beliefs, sexual orientation, citizenship or immigration status, medical or health information, genetic or biometric data, or geolocation data). It will, however, require controllers to first provide consumers with clear notice and an opportunity to opt out of the processing of their sensitive data. With respect to the processing of personal data “concerning a known child” (under age 13), controllers must process such data in accordance with the Children’s Online Privacy Protection Act. The UCPA will prohibit controllers from discriminating against consumers for exercising their rights.

In addition, the UCPA will require controllers to implement reasonable and appropriate data security measures, provide certain content in their privacy notices, and include specific language in contracts with processors.

Unlike the CCPA/CPRA, VCDPA and CPA, the UCPA will not require controllers to conduct data protection assessments prior to engaging in data processing activities that present a heightened risk of harm to consumers, or to conduct cybersecurity audits or risk assessments.

In line with existing U.S. state privacy laws, the UCPA does not provide for a private right of action. The law will be enforced by the Utah Attorney General.

Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

EDPB on Dark Patterns: Lessons for Marketing Teams

“Dark patterns” are becoming the target of EU data protection authorities, and the new guidelines of the European Data Protection Board (EDPB) on “dark patterns in social media platform interfaces” confirm their focus on such practices. While they are built around examples from social media platforms (real or fictitious), these guidelines contain lessons for all websites and applications. The bad news for marketers: the EDPB doesn’t like it when dry legal texts and interfaces are made catchier or more enticing.

To illustrate, in a section of the guidelines regarding the selection of an account profile photo, the EDPB considers the example of a “help/information” prompt saying “No need to go to the hairdresser’s first. Just pick a photo that says ‘this is me.’” According to the EDPB, such a practice “can impact the final decision made by users who initially decided not to share a picture for their account” and thus makes consent invalid under the General Data Protection Regulation (GDPR). Similarly, the EDPB criticises an extreme example of a cookie banner with a humorous link to a bakery cookies recipe that incidentally says, “we also use cookies”, stating that “users might think they just dismiss a funny message about cookies as a baked snack and not consider the technical meaning of the term ‘cookies.’” The EDPB even suggests that the data minimisation principle, and not security concerns, should ultimately guide an organisation’s choice of which two-factor authentication method to use.

Do these new guidelines reflect privacy paranoia or common sense? The answer should lie somewhere in between, but the whole document (64 pages long) in our view suggests an overly strict approach, one that we hope will move closer to common sense as a result of a newly started public consultation process.

Let us take a closer look at what useful lessons – or warnings – can be drawn from these new guidelines.

What are “dark patterns” and when are they unlawful?

According to the EDPB, dark patterns are “interfaces and user experiences […] that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data” (p. 2). They “aim to influence users’ behaviour and can hinder their ability to effectively protect their personal data and make conscious choices.” The risk associated with dark patterns is higher for websites or applications meant for children, as “dark patterns raise additional concerns regarding potential impact on children” (p. 8).

While the EDPB takes a strongly negative view of dark patterns in general, it recognises that dark patterns do not automatically lead to an infringement of the GDPR. The EDPB acknowledges that “[d]ata protection authorities are responsible for sanctioning the use of dark patterns if these breach GDPR requirements” (emphasis ours; p. 2). Nevertheless, the EDPB guidance strongly links the concept of dark patterns with the data protection by design and by default principles of Art. 25 GDPR, suggesting that disregard for those principles could lead to a presumption that the language or a practice in fact creates a “dark pattern” (p. 11).

The EDPB refers here to its Guidelines 4/2019 on Article 25 Data Protection by Design and by Default and in particular to the following key principles:

  • “Autonomy – Data subjects should be granted the highest degree of autonomy possible to determine the use made of their personal data, as well as autonomy over the scope and conditions of that use or processing.
  • Interaction – Data subjects must be able to communicate and exercise their rights in respect of the personal data processed by the controller.
  • Expectation – Processing should correspond with data subjects’ reasonable expectations.
  • Consumer choice – The controllers should not “lock in” their users in an unfair manner. Whenever a service processing personal data is proprietary, it may create a lock-in to the service, which may not be fair, if it impairs the data subjects’ possibility to exercise their right of data portability in accordance with Article 20 GDPR.
  • Power balance – Power balance should be a key objective of the controller-data subject relationship. Power imbalances should be avoided. When this is not possible, they should be recognised and accounted for with suitable countermeasures.
  • No deception – Data processing information and options should be provided in an objective and neutral way, avoiding any deceptive or manipulative language or design.
  • Truthful – the controllers must make available information about how they process personal data, should act as they declare they will and not mislead data subjects.”

Is data minimisation compatible with the use of SMS two-factor authentication?

One of the EDPB’s positions, while grounded in the principle of data minimisation, undercuts a security practice that has grown significantly over the past few years. In effect, the EDPB seems to question the validity under the GDPR of requests for phone numbers for two-factor authentication where e-mail tokens would theoretically be possible:

“30. To observe the principle of data minimisation, [organisations] are required not to ask for additional data such as the phone number, when the data users already provided during the sign-up process are sufficient. For example, to ensure account security, enhanced authentication is possible without the phone number by simply sending a code to users’ email accounts or by several other means.
31. Social network providers should therefore rely on means for security that are easier for users to re-initiate. For example, the [organisation] can send users an authentication number via an additional communication channel, such as a security app, which users previously installed on their mobile phone, but without requiring the users’ mobile phone number. User authentication via email addresses is also less intrusive than via phone number because users could simply create a new email address specifically for the sign-up process and utilise that email address mainly in connection with the Social Network. A phone number, however, is not that easily interchangeable, given that it is highly unlikely that users would buy a new SIM card or conclude a new phone contract only for the reason of authentication.”
(emphasis ours; p. 15)

The EDPB also appears to be highly critical of phone-based verification in the context of registration “because the email address constitutes the regular contact point with users during the registration process” (p. 15).

This position is unfortunate, as it suggests that data minimisation may preclude controllers from even assessing which method of two-factor authentication – in this case, e-mail versus SMS one-time passwords – better suits their requirements, taking into consideration the different security benefits and drawbacks of the two methods. The EDPB’s reasoning could even be used to exclude any form of stronger two-factor authentication, as additional forms inevitably require separate processing (e.g., phone number or third-party account linking for some app-based authentication methods).

For these reasons, organisations should view this aspect of the new EDPB guidelines with a healthy dose of skepticism. It likewise will be important for interested stakeholders to participate in the consultation to explain the security benefits of using phone numbers to keep the “two” in two-factor authentication.
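For context, the email-token approach the EDPB points to can be sketched as follows. The example below uses only the Python standard library; the SMTP host, sender address, and in-memory code store are placeholders we have invented for illustration. It shows the concept the guidelines describe (a one-time code sent to the address already collected at sign-up, so no phone number is processed); it is not a hardened implementation, and it is not a recommendation for or against SMS-based factors.

```python
import secrets
import smtplib
import time
from email.message import EmailMessage

# Pending codes keyed by email; a real deployment would use a short-lived,
# server-side cache tied to the login session rather than a module-level dict.
_pending = {}

def issue_email_code(user_email: str, smtp_host: str = "localhost") -> None:
    """Send a one-time code to the address already collected at sign-up,
    so no additional identifier (such as a phone number) is processed."""
    code = f"{secrets.randbelow(10**6):06d}"
    _pending[user_email] = (code, time.time() + 300)  # valid for five minutes

    msg = EmailMessage()
    msg["Subject"] = "Your sign-in code"
    msg["From"] = "no-reply@example.com"
    msg["To"] = user_email
    msg.set_content(f"Your one-time code is {code}. It expires in 5 minutes.")
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)

def verify_email_code(user_email: str, submitted: str) -> bool:
    """Accept the code once, only within its validity window."""
    code, expires = _pending.pop(user_email, (None, 0.0))
    return code is not None and time.time() < expires and secrets.compare_digest(code, submitted)
```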

Consent withdrawal: same number of clicks?

Recent decisions by EU regulators (notably two decisions by the French authority, the CNIL) have led to speculation about whether EU rules effectively require website operators to make it possible for data subjects to withdraw consent to all cookies with one single click, just as most websites make it possible to give consent through a single click. The authorities themselves have not stated that this is unequivocally required, although privacy activists notably filed complaints against hundreds of websites, many of them for not including a “reject all” button on their cookie banner.

The EDPB now appears to side with the privacy activists in this respect, stating that “consent cannot be considered valid under the GDPR when consent is obtained through only one mouse-click, swipe or keystroke, but the withdrawal takes more steps, is more difficult to achieve or takes more time” (p. 14).

Operationally, however, it seems impossible to comply with a “one-click withdrawal” standard in absolute terms. Just pulling up settings after registration or after the first visit to a website will always require an extra click, purely to open those settings. We expect this issue to be examined by the courts eventually.

Is creative wording indicative of a “dark pattern”?

The EDPB’s guidelines contain several examples of wording that is intended to convince the user to take a specific action.

The photo example mentioned in the introduction above is an illustration, but other (likely fictitious) examples include the following:

  • For sharing geolocation data: “Hey, a lone wolf, are you? But sharing and connecting with others help make the world a better place! Share your geolocation! Let the places and people around you inspire you!” (p.17)
  • To prompt a user to provide a self-description: “Tell us about your amazing self! We can’t wait, so come on right now and let us know!” (p. 17)

The EDPB criticises the language used, stating that it is “emotional steering”:

“[S]uch techniques do not cultivate users’ free will to provide their data, since the prescriptive language used can make users feel obliged to provide a self-description because they have already put time into the registration and wish to complete it. When users are in the process of registering to an account, they are less likely to take time to consider the description they give or even if they would like to give one at all. This is particularly the case when the language used delivers a sense of urgency or sounds like an imperative. If users feel this obligation, even when in reality providing the data is not mandatory, this can have an impact on their “free will”” (pp. 17-18).

Similarly, in a section about account deletion and deactivation, the EDPB criticises interfaces that highlight “only the negative, discouraging consequences of deleting their accounts,” e.g., “you’ll lose everything forever,” or “you won’t be able to reactivate your account” (p. 55). The EDPB even criticises interfaces that preselect deactivation or pause options over delete options, considering that “[t]he default selection of the pause option is likely to nudge users to select it instead of deleting their account as initially intended. Therefore, the practice described in this example can be considered as a breach of Article 12 (2) GDPR since it does not, in this case, facilitate the exercise of the right to erasure, and even tries to nudge users away from exercising it” (p. 56). This, combined with the EDPB’s aversion to confirmation requests (see section 5 below), suggests that the EDPB is ignoring the risk that a data subject might opt for deletion without fully recognizing the consequences, i.e., loss of access to the deleted data.

The EDPB’s approach suggests that any effort to woo users into giving more data or leaving data with the organisation will be viewed as harmful by data protection authorities. Yet data protection rules are there to prevent abuse and protect data subjects, not to render all marketing techniques illegal.

In this context, the guidelines should in our opinion be viewed as an invitation to re-examine marketing techniques to ensure that they are not too pushy – in the sense that users would in effect truly be pushed into a decision regarding personal data that they would not otherwise have made. Marketing techniques are not per se unlawful under the GDPR but may run afoul of GDPR requirements in situations where data subjects are misled or robbed of their choice.

Other key lessons for marketers and user interface designers

  • Avoid continuous prompting: One of the issues regularly highlighted by the EDPB is “continuous prompting”, i.e., prompts that appear again and again during a user’s experience on a platform. The EDPB suggests that this creates fatigue, leading the user to “give in,” i.e., by “accepting to provide more data or to consent to another processing, as they are wearied from having to express a choice each time they use the platform” (p. 14). Examples given by the EDPB include the SMS two-factor authentication popup mentioned above, as well as “import your contacts” functionality. Outside of social media platforms, the main example for most organisations is their cookie policy (so this position by the EDPB reinforces the need to manage cookie banners properly). In addition, newsletter popups and popups about “how to get our new report for free by filling out this form” are frequent on many digital properties. While popups can be effective ways to get more subscribers or more data, the EDPB guidance suggests that regulators will consider such practices questionable from a data protection perspective.
  • Ensure consistency or a justification for confirmation steps: The EDPB highlights the “longer than necessary” dark pattern at several places in its guidelines (in particular pp. 18, 52, & 57), with illustrations of confirmation pop-ups that appear before a user is allowed to select a more privacy-friendly option (and while no such confirmation is requested for more privacy-intrusive options). Such practices are unlawful according to the EDPB. This does not mean that confirmation pop-ups are always unlawful – just that you need to have a good justification for using them where you do.
  • Have a good reason for preselecting less privacy-friendly options: Because the GDPR requires not only data protection by design but also data protection by default, make sure that you are able to justify an interface in which a more privacy-intrusive option is selected by default – or better yet, don’t make any preselection. The EDPB calls preselection of privacy-intrusive options “deceptive snugness” (“Because of the default effect which nudges individuals to keep a pre-selected option, users are unlikely to change these even if given the possibility” p. 19).
  • Make all privacy settings available in all platforms: If a user is asked to make a choice during registration or upon his/her first visit (e.g., for cookies, newsletters, sharing preferences, etc.), ensure that those settings can all be found easily later on, from a central privacy settings page if possible, and alongside all data protection tools (such as tools for exercising a data subject’s right to access his/her data, to modify data, to delete an account, etc.). Also make sure that all such functionality is available not only on a desktop interface but also for mobile devices and across all applications. The EDPB illustrates this point by criticising the case where an organisation has a messaging app that does not include the same privacy statement and data subject request tools as the main app (p. 27).
  • Be clearer in using general language such as “Your data might be used to improve our services”: It is common in most privacy statements to include a statement that personal data (e.g., customer feedback) “can” or “may be used” to improve an organisation’s products and services. According to the EDPB, the word “services” is likely to be “too general” to be viewed as “clear,” and it is “unclear how data will be processed for the improvement of services.” The use of the conditional tense in the example (“might”) also “leaves users unsure whether their data will be used for the processing or not” (p. 25). The EDPB’s stance in this respect confirms a position taken by EU regulators in previous guidance on transparency and serves as a reminder to tell data subjects specifically how their data will be used.
  • Ensure linguistic consistency: If your website or app is available in more than one language, ensure that all data protection notices and tools are available in those languages as well and that the language choice made on the main interface is automatically taken into account on the data-related pages (pp. 25-26).

Best practices according to the EDPB

Finally, the EDPB highlights some other “best practices” throughout its guidelines. We have combined them below for easier review:

  • Structure and ease of access:
    • Shortcuts: Links to information, actions, or settings that can be of practical help to users to manage their data and data protection settings should be available wherever they relate to information or experience (e.g., links redirecting to the relevant parts of the privacy policy; in the case of a data breach communication to users, to provide users with a link to reset their password).
    • Data protection directory: For easy navigation through the different sections of the menu, provide users with an easily accessible page from where all data protection-related actions and information are accessible. This page could be found in the organisation’s main navigation menu, the user account, through the privacy policy, etc.
    • Privacy Policy Overview: At the start/top of the privacy policy, include a collapsible table of contents with headings and sub-headings that shows the different passages the privacy notice contains. Clearly identified sections allow users to quickly identify and jump to the section they are looking for.
    • Sticky navigation: While consulting a page related to data protection, the table of contents could be constantly displayed on the screen allowing users to quickly navigate to relevant content thanks to anchor links.
  • Transparency:
    • Organisation contact information: The organisation’s contact address for addressing data protection requests should be clearly stated in the privacy policy. It should be present in a section where users can expect to find it, such as a section on the identity of the data controller, a rights related section, or a contact section.
    • Reaching the supervisory authority: Stating the specific identity of the EU supervisory authority and including a link to its website or the specific website page for lodging a complaint is another EDPB recommendation. This information should be present in a section where users can expect to find it, such as a rights-related section.
    • Change spotting and comparison: When changes are made to the privacy notice, make previous versions accessible with the date of release and highlight any changes.
  • Terminology & explanations:
    • Coherent wording: Across the website, the same wording and definition is used for the same data protection concepts. The wording used in the privacy policy should match that used on the rest of the platform.
    • Providing definitions: When using unfamiliar or technical words or jargon, providing a definition in plain language will help users understand the information provided to them. The definition can be given directly in the text when users hover over the word and/or be made available in a glossary.
    • Explaining consequences: When users want to activate or deactivate a data protection control, or give or withdraw their consent, inform them in a neutral way of the consequences of such action.
    • Use of examples: In addition to providing mandatory information that clearly and precisely states the purpose of processing, offering specific data processing examples can make the processing more tangible for users.
  • Contrasting Data Protection Elements: Making data protection-related elements or actions visually striking in an interface that is not directly dedicated to the matter helps readability. For example, when posting a public message on the platform, controls for geolocation should be directly available and clearly visible.
  • Data Protection Onboarding: Just after the creation of an account, include data protection points within the onboarding experience for users to discover and set their preferences seamlessly. This can be done by, for example, inviting them to set their data protection preferences after adding their first friend or sharing their first post.
  • Notifications (including data breach notifications): Notifications can be used to raise awareness of users of aspects, changes, or risks related to personal data processing (e.g., when a data breach occurs). These notifications can be implemented in several ways, such as through inbox messages, pop-in windows, fixed banners at the top of the webpage, etc.

Next steps and international perspectives

These guidelines (available online) are subject to public consultation until 2 May 2022, so it is possible they will be modified as a result of the consultation and, we hope, improved to reflect a more pragmatic view of data protection that balances data subjects’ rights, security, and operational business needs. If you wish to contribute to the public consultation, note that the EDPB publishes feedback it receives (as a result, we have occasionally submitted feedback on behalf of clients wishing to remain anonymous).

Irrespective of the outcome of the public consultation, the guidelines are guaranteed to have an influence on the approach of EU data protection authorities in their investigations. From this perspective, it is better to be forewarned – and to have legal arguments at your disposal if you wish to adopt an approach that deviates from the EDPB’s position.

Moreover, these guidelines come at a time when the United States Federal Trade Commission (FTC) is also concerned with dark patterns. The FTC recently published an enforcement policy statement on the matter in October 2021. Dark patterns are also being discussed at the Organisation for Economic Cooperation and Development (OECD). International dialogue can be helpful if conversations about desired policy also consider practical solutions that can be implemented by businesses and reflect a desirable user experience for data subjects.

Organisations should consider evaluating their own techniques to encourage users to go one way or another and document the justification for their approach.

© 2022 Keller and Heckman LLP

Google to Launch Google Analytics 4 in an Attempt to Address EU Privacy Concerns

On March 16, 2022, Google announced the launch of its new analytics solution, “Google Analytics 4.” Google Analytics 4 aims, among other things, to address recent developments in the EU regarding the use of analytics cookies and data transfers resulting from such use.

Background

On August 17, 2020, the non-governmental organization None of Your Business (“NOYB”) filed 101 identical complaints with 30 European Economic Area data protection authorities (“DPAs”) regarding the use of Google Analytics by various companies. The complaints focused on whether the transfer of EU personal data to Google in the U.S. through the use of cookies is permitted under the EU General Data Protection Regulation (“GDPR”), following the Schrems II judgment of the Court of Justice of the European Union. Following these complaints, the French and Austrian DPAs ruled that the transfer of EU personal data from the EU to the U.S. through the use of the Google Analytics cookie is unlawful.

Google’s New Solution

According to Google’s press release, Google Analytics 4 “is designed with privacy at its core to provide a better experience for both our customers and their users. It helps businesses meet evolving needs and user expectations, with more comprehensive and granular controls for data collection and usage.”

The most impactful change from an EU privacy standpoint is that Google Analytics 4 will no longer store IP addresses, thereby limiting the data transfers resulting from the use of Google Analytics that were under scrutiny in the EU following the Schrems II ruling. It remains to be seen whether this change will ease EU DPAs’ concerns about Google Analytics’ compliance with the GDPR.

Google’s previous analytics solution, Universal Analytics, will no longer be available beginning July 2023. In the meantime, companies are encouraged to transition to Google Analytics 4.


Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

Chinese APT41 Attacking State Networks

Although we are receiving frequent alerts from CISA and the FBI about the potential for increased cyber threats coming out of Russia, China continues its cyber threat activity through APT41, which has been linked to China’s Ministry of State Security. According to Mandiant, APT41 has launched a “deliberate campaign targeting U.S. state governments” and has successfully attacked at least six state government networks by exploiting various vulnerabilities, including Log4j.

According to Mandiant, although the China-based hackers are kicked out of state government networks, they repeat the attack weeks later and keep trying to get into the same networks via different vulnerabilities (a “re-compromise”). One vulnerability they successfully exploited was a zero-day in USAHerds, software that state agriculture agencies use to monitor livestock. When the intruders use the USAHerds vulnerability to get into a network, they can then leverage that foothold to move to other parts of the network and access and steal information, including personal information.

Mandiant’s outlook on these attacks is sobering:

“APT41’s recent activity against U.S. state governments consists of significant new capabilities, from new attack vectors to post-compromise tools and techniques. APT41 can quickly adapt their initial access techniques by re-compromising an environment through a different vector, or by rapidly operationalizing a fresh vulnerability. The group also demonstrates a willingness to retool and deploy capabilities through new attack vectors as opposed to holding onto them for future use. APT41 exploiting Log4J in close proximity to the USAHerds campaign showed the group’s flexibility to continue targeting U.S. state governments through both cultivated and co-opted attack vectors. Through all the new, some things remain unchanged: APT41 continues to be undeterred by the U.S. Department of Justice (DOJ) indictment in September 2020.”

Both Russia and China continue to conduct cyber-attacks against private and public networks in the U.S., and there is no indication that the attacks will subside anytime soon.

Copyright © 2022 Robinson & Cole LLP. All rights reserved.

Securities Litigation: An Emerging Strategy to Hold Companies Accountable for Privacy Protections

A California federal judge rejected Zoom Video Communications, Inc.’s motion to dismiss securities fraud claims against it, and its CEO and CFO, for misrepresenting Zoom’s privacy protections. Although there have been a number of cases challenging inadequate privacy protections on consumer protection grounds in recent years, this decision shifts the spotlight to an additional front on which the battles for privacy protection may be fought:  the securities-litigation realm.

At issue were statements made by Zoom relating to the company’s privacy and encryption methods, including Zoom’s 2019 Registration Statement and Prospectus, which told investors the company offered “robust security capabilities, including end-to-end encryption.” Importantly, the prospectus was signed by Zoom’s CEO, Eric Yuan. The plaintiffs, a group of Zoom shareholders, brought suit arguing that end-to-end encryption means that only meeting participants and no other person, not even the platform provider, would be able to access the content. The complaint alleged that contrary to this statement, Zoom maintained access to the cryptographic keys that could allow it to access the unencrypted video and audio content of Zoom meetings.
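To illustrate the technical distinction the complaint turns on, the toy example below contrasts the two models: with end-to-end encryption the key exists only on participants' devices, so the platform relays ciphertext it cannot read, whereas a provider that generates or retains the key can decrypt content even though the connection is encrypted in transit. It uses the third-party `cryptography` package purely for illustration and is not a description of Zoom's actual architecture.

```python
# pip install cryptography  (third-party library, used here only to illustrate the concept)
from cryptography.fernet import Fernet

frame = b"meeting audio/video frame"

# End-to-end model: the key is generated and held only on participants' devices.
participant_key = Fernet.generate_key()
ciphertext = Fernet(participant_key).encrypt(frame)
# The platform relays `ciphertext` but, without participant_key, cannot recover `frame`.

# Provider-mediated model: the platform generates and retains the key,
# so it can decrypt traffic even though the link is encrypted in transit.
provider_key = Fernet.generate_key()
ciphertext_2 = Fernet(provider_key).encrypt(frame)
visible_to_provider = Fernet(provider_key).decrypt(ciphertext_2)

print(visible_to_provider == frame)  # True: the provider can read the content
```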

The plaintiffs’ allegations are based on media reports of security issues relating to Zoom conferences early in the COVID-19 pandemic, as well as an April 2020 Zoom blog post in which Yuan stated that Zoom had “fallen short of the community’s – and our own – privacy and security expectations.” In his post, Yuan linked to another Zoom executive’s post, which apologized for “incorrectly suggesting” that Zoom meetings used end-to-end encryption.

In their motion to dismiss, the defendants did not dispute that the company said it used end-to-end encryption.  Instead, they challenged plaintiffs’ falsity, scienter, and loss causation allegations – and all three attempts were rejected by the court.

First, as to falsity, the court did not buy the defendants’ argument that “end-to-end encryption” could have different meanings, because a Zoom executive expressly acknowledged that the company had “incorrectly suggest[ed] that Zoom meetings were capable of using end-to-end encryption.” Thus, the court found that the complaint did, in fact, plead the existence of materially false and misleading statements. As to scienter, the court also rejected the defendants’ argument that Yuan’s understanding of the term “end-to-end encryption” changed in a relevant way from the time he made the challenged representation to his later statements that Zoom’s usage was inconsistent with “the commonly accepted definition.” The court looked to Yuan’s advanced degree in engineering, his status as a “founding engineer” at WebEx, and the fact that he had personally “led the effort to engineer Zoom Meetings’ platform and is named on several patents that specifically concern encryption techniques.”

Lastly, the court rebuffed the defendants’ attempt at undermining loss causation, finding that the plaintiffs had pled facts to plausibly suggest a causal connection between the defendants’ allegedly fraudulent conduct and the plaintiffs’ economic loss. In particular, the court referenced the decline in Zoom’s stock price shortly after defendants’ fraud was revealed to the market via media reports and Yuan’s blog post.

That said, the court dismissed the plaintiffs’ remaining claims, as they related to data privacy statements made by Zoom or, in general, by the “defendants,” unlike the specific encryption-related statement made by Yuan. The court found that the corporate-made statements did not rise to the level of an “exceptional case where a company’s public statements were so important and so dramatically false that they would create a strong inference that at least some corporate officials knew of the falsity upon publication.” Because those statements were not coupled with sufficient allegations of individual scienter, the court granted the defendants’ motion to dismiss those statements from the complaint.

© 2022 Proskauer Rose LLP.