Securities Litigation: An Emerging Strategy to Hold Companies Accountable for Privacy Protections

A California federal judge rejected Zoom Video Communications, Inc.’s motion to dismiss securities fraud claims against it and its CEO and CFO for misrepresenting Zoom’s privacy protections. Although a number of cases in recent years have challenged inadequate privacy protections on consumer protection grounds, this decision shifts the spotlight to an additional front on which battles for privacy protection may be fought: the securities-litigation realm.

At issue were statements made by Zoom relating to the company’s privacy and encryption methods, including Zoom’s 2019 Registration Statement and Prospectus, which told investors the company offered “robust security capabilities, including end-to-end encryption.” Importantly, the prospectus was signed by Zoom’s CEO, Eric Yuan. The plaintiffs, a group of Zoom shareholders, brought suit arguing that end-to-end encryption means that only meeting participants and no other person, not even the platform provider, would be able to access the content. The complaint alleged that contrary to this statement, Zoom maintained access to the cryptographic keys that could allow it to access the unencrypted video and audio content of Zoom meetings.
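For readers who want the distinction in concrete terms, here is a minimal Python sketch (using the third-party cryptography package) of the difference the complaint turns on; the keys and messages are illustrative only, not Zoom’s actual architecture:

    from cryptography.fernet import Fernet

    # True end-to-end encryption: the key is generated and held only by the
    # meeting participants; the provider relays ciphertext it cannot read.
    participant_key = Fernet.generate_key()
    ciphertext = Fernet(participant_key).encrypt(b"meeting audio/video frame")

    # What the complaint alleged instead: the provider itself holds the key,
    # so it can recover the plaintext of anything crossing its servers.
    provider_key = Fernet.generate_key()
    relayed = Fernet(provider_key).encrypt(b"meeting audio/video frame")
    print(Fernet(provider_key).decrypt(relayed))  # provider reads the content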

The plaintiffs’ allegations are based on media reports of security issues relating to Zoom conferences early in the COVID-19 pandemic, as well as an April 2020 Zoom blog post in which Yuan stated that Zoom had “fallen short of the community’s – and our own – privacy and security expectations.” In his post, Yuan linked to another Zoom executive’s post, which apologized for “incorrectly suggesting” that Zoom meetings used end-to-end encryption.

In their motion to dismiss, the defendants did not dispute that the company said it used end-to-end encryption. Instead, they challenged the plaintiffs’ falsity, scienter, and loss causation allegations – and the court rejected all three challenges.

First, as to falsity, the court did not buy the defendants’ argument that “end-to-end encryption” could have different meanings, because a Zoom executive expressly acknowledged that the company had “incorrectly suggest[ed] that Zoom meetings were capable of using end-to-end encryption.” Thus, the court found that the complaint did, in fact, plead the existence of materially false and misleading statements. Second, as to scienter, the court rejected the defendants’ argument that Yuan’s understanding of the term “end-to-end encryption” changed in a relevant way between the time he made the challenged representation and his later statements that Zoom’s usage was inconsistent with “the commonly accepted definition.” The court looked to Yuan’s advanced degree in engineering, his status as a “founding engineer” at WebEx, and the fact that he had personally “led the effort to engineer Zoom Meetings’ platform and is named on several patents that specifically concern encryption techniques.”

Lastly, the court rebuffed the defendants’ attempt to undermine loss causation, finding that the plaintiffs had pled facts plausibly suggesting a causal connection between the defendants’ allegedly fraudulent conduct and the plaintiffs’ economic loss. In particular, the court referenced the decline in Zoom’s stock price shortly after the alleged fraud was revealed to the market via media reports and Yuan’s blog post.

That said, the court dismissed the plaintiffs’ remaining claims, which related to data privacy statements attributed to Zoom or, generically, to the “defendants,” rather than to the specific encryption-related statement made by Yuan. The court found that these corporate statements did not rise to the level of an “exceptional case where a company’s public statements were so important and so dramatically false that they would create a strong inference that at least some corporate officials knew of the falsity upon publication.” Because those statements were not coupled with sufficient allegations of individual scienter, the court granted the defendants’ motion to dismiss as to those statements.

© 2022 Proskauer Rose LLP.

GDPR Privacy Rules: The Other Shoe Drops

Four years after GDPR was implemented, we are seeing the pillars of the internet business destroyed. Given two new EU decisions affecting the practical management of data, all companies collecting consumer data in the EU are re-evaluating their business models and will soon be considering wholesale changes.

On one hand, the GDPR is creating the world its drafters intended – a world where personal data is less of a commodity exploited and traded by business. On the other hand, GDPR enforcement has taken the form of a wrecking ball, leading to data localization in Europe and substitution of government meddling for consumer choice.

For years we have watched the EU courts and enforcement agencies apply GDPR text to real-life cases, wondering if the legal application would be more of a nip and tuck operation on ecommerce or something more bloody and brutal. In 2022, we received our answer, and the bodies are dropping.

In January, Austria’s data protection authority decided that companies can’t use Google Analytics to study their own sites’ web traffic. French regulators reached the same conclusion last week. While Google doesn’t announce statistics about product usage, website technology tracker BuiltWith reports that 29.3 million websites use Google Analytics, including 69.5 percent of Quantcast’s Top 10,000 sites – more than ten times the next most popular option. So vast numbers of companies operating in Europe will need to change their platform analytics provider – if the Euro-crats will allow them to use site analytics at all.

But these decisions were not based on the functionality of Google Analytics, a tool that does not even capture personally identifiable information – no names, no home or office addresses, no phone numbers. Instead, these decisions, which will harm thousands of businesses, were a result of the Schrems II decision, which found fault with the transfer of this non-identifiable data to a company based in the United States. The problem for European decision-makers is that US law enforcement may have access to this data if courts allow it. I have written before about this illogical conclusion and won’t restate the many arguments here, other than to say that EU law enforcement behaves the same way.

The effects of this decision will be felt far beyond the huge customer base of Google Analytics. The logic of this decision effectively means that companies collecting data from EU citizens can no longer use US-based cloud services like Amazon Web Services, IBM, Google, Oracle or Microsoft. I would anticipate that huge cloud player Alibaba Cloud could suffer the same proscription if Europe’s privacy panjandrums decide that China’s privacy protection is as threatening as that of the US.

The Austrians held that all the sophisticated measures taken by Google to encrypt analytic data meant nothing, because if Google could decrypt it, so could the US government. By this logic, no US cloud provider – the world’s primary business data support network – could “safely” hold EU data. Which means that the Euro-crats are preparing to fine any EU company that uses a US cloud provider. Max Schrems saw this decision in stark terms, stating, “The bottom line is: Companies can’t use US cloud services in Europe anymore.”

This decision will ultimately support the Euro-crats’ goal of data localization as companies try to organize local storage/processing solutions to avoid fines. Readers of this blog have seen coverage of the EU’s tilt toward data localization (for example, here and here) and away from the open internet that European politicians once held as the ideal. The Euro-crats are taking serious steps toward forcing localized data processing and cutting US businesses out of the ecommerce business ecosystem. The Google Analytics decision is likely to be seen as a tipping point in years to come.

In a second major practical online privacy decision, earlier this month the Belgian Data Protection Authority ruled that the Interactive Advertising Bureau Europe’s Transparency and Consent Framework (TCF), a widely used technical standard built for publishers, advertisers, and technology vendors to obtain user consent for data processing, does not comply with the GDPR. The TCF allows users to accept or reject cookie-based advertising, relieving websites of the need to create their own expensive technical solutions and creating a consistent experience for consumers. Now the TCF is considered per se illegal under EU privacy rules, leaving thousands of businesses to search for or design their own alternatives, and removing online choices for European residents.
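For the technically minded, the Python sketch below illustrates the kind of signal the TCF standardizes: a per-user record of which processing purposes and which vendors the user has consented to, which publishers and ad-tech vendors consult before processing. This is a simplified stand-in of our own devising, not the TCF’s actual binary “TC string” format:

    from dataclasses import dataclass, field

    @dataclass
    class ConsentRecord:
        purposes: set[int] = field(default_factory=set)  # e.g. 1 = store/access cookies
        vendors: set[int] = field(default_factory=set)   # IAB-style vendor IDs

        def allows(self, vendor_id: int, purpose_id: int) -> bool:
            # A vendor may process only for purposes the user accepted.
            return vendor_id in self.vendors and purpose_id in self.purposes

    record = ConsentRecord(purposes={1, 3}, vendors={42})
    print(record.allows(vendor_id=42, purpose_id=1))  # True: consent recorded
    print(record.allows(vendor_id=99, purpose_id=1))  # False: vendor not consented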

The Belgian privacy authority reached this conclusion by holding that the Interactive Advertising Bureau was a “controller” of all the data managed under its proposed framework. As stated by the Center for Data Innovation, this decision implies “that any good-faith effort to implement a common data protection protocol by an umbrella organization that wants to uphold GDPR makes said organization liable for the data processing that takes place under this protocol.” No industry group will want to put itself in this position, leaving businesses to their own devices and making ecommerce data collection much less consistent and much more expensive – even if that data collection is necessary to fulfill the requests of consumers.

For years companies thought that informed consumer consent would be a way to personalize messaging and keep consumer costs low online, but the EU has thrown all online consent regimes into question. EU regulators have effectively decided that people can’t make their own decisions about allowing data to be collected. If TCF – the consent system used by 80% of the European internet and a system designed specifically to meet the demands of the GDPR – is now illegal, then, for a second time in a month, all online consumer commerce is thrown into confusion. Thousands were operating websites with TCF and Google Analytics, believing they were following the letter of the law.  That confidence has been smashed.

We are finally seeing the practical effects of the GDPR beyond its simple utility for fining US tech companies.  Those effects are leading to a closed-border internet around Europe and a costlier, less customizable internet for EU citizens. The EU is clearly harming businesses around the world and making its internet a more cramped place. I have trouble seeing the logic and benefit of these decisions, but the GDPR was written to shake the system, and privacy benefits may emerge.

Copyright © 2022 Womble Bond Dickinson (US) LLP All Rights Reserved.

FBI and DHS Warn of Russian Cyberattacks Against Critical Infrastructure

U.S. officials this week warned government agencies, cybersecurity personnel, and operators of critical infrastructure that Russia might launch cyber-attacks against Ukrainian and U.S. networks at the same time it launches its military offensive against Ukraine.

The FBI and the Department of Homeland Security (DHS) warned law enforcement, military personnel, and operators of critical infrastructure to be vigilant in searching for Russian activity on their networks and to report any suspicious activity, as they are seeing an increase in Russian scanning of U.S. networks. U.S. officials are also seeing increased disinformation and misinformation generated by Russia about Ukraine.

The FBI and DHS urged timely patching of systems and reporting of any Russian activity on networks, so U.S. officials can assess the threat, assist with a response, and prevent further activity.


Even though a war may be starting halfway across the world, Russia’s cyber capabilities are global. Russia has the capability to bring us all into its war by attacking U.S. government agencies and companies. We all have an important part to play in preventing attacks and in helping others avoid becoming victims of Russia’s attacks. Closely watch your network for any suspicious activity and report it, no matter how small you think it is.

Copyright © 2022 Robinson & Cole LLP. All rights reserved.

Ransom Demands: To Pay or Not to Pay?

As the threat of ransomware attacks against companies has skyrocketed, so has the burden on companies forced to decide whether to pay cybercriminals a ransom demand. Corporate management increasingly is faced with balancing myriad legal and business factors in making real-time, high-stakes “bet the company” decisions with little or no precedent to follow. In a recent advisory, the U.S. Department of the Treasury (Treasury) has once again discouraged companies from making ransom payments, warning that those who pay risk potential sanctions.

OFAC Ransom Advisory

On September 21, 2021, the Treasury’s Office of Foreign Assets Control (OFAC) issued an Advisory that updates and supersedes OFAC’s Advisory on Potential Sanctions Risks for Facilitating Ransomware Payments, issued on October 1, 2020. This updated OFAC Advisory follows on the heels of the Biden Administration’s heightened interest in combating the growing risk and reality of cyber threats that may adversely impact national security and the economy.

According to Federal Bureau of Investigation (FBI) statistics from 2019 to 2020 on ransomware attacks, there was a 21 percent increase in reported ransomware attacks and a 225 percent increase in associated losses. All organizations across all industry sectors in the private and public arenas are potential targets of such attacks. As noted by OFAC, cybercriminals often target particularly vulnerable entities, such as schools and hospitals, among others.

While some cybercriminals are linked to foreign state actors primarily motivated by political interests, many threat actors are simply in it “for the money.” Every day cybercriminals launch ransomware attacks to wreak havoc on vulnerable organizations, disrupting their business operations by encrypting and potentially stealing their data. These cybercriminals often demand ransom payments in the millions of dollars in exchange for a “decryptor” key to unlock encrypted files and/or a “promise” not to use or publish stolen data on the Dark Web.

The recent OFAC Advisory states in no uncertain terms that the “U.S. government strongly discourages all private companies and citizens from paying ransom or extortion demands.” OFAC notes that such ransomware payments could be “used to fund activities adverse to the national security and foreign policy objectives of the United States.” The Advisory further states that ransom payments may perpetuate future cyber-attacks by incentivizing cybercriminals. In addition, OFAC cautions that in exchange for payments to cybercriminals “there is no guarantee that companies will regain access to their data or be free from further attacks.”

The OFAC Advisory also underscores the potential risk of violating sanctions associated with ransom payments by organizations. As a reminder, various U.S. federal laws, including the International Emergency Economic Powers Act and the Trading with the Enemy Act, prohibit U.S. persons or entities from engaging in financial or other transactions with certain blacklisted individuals, organizations or countries – including those listed on OFAC’s Specially Designated Nationals and Blocked Persons List or countries subject to embargoes (such as Cuba, the Crimea region of Ukraine, North Korea and Syria).

Penalties & Mitigating Factors

If a ransom payment is deemed to have been made to a cybercriminal with a nexus to a blacklisted organization or country, OFAC may impose civil monetary penalties for violations of sanctions based on strict liability, even if a person or organization did not know it was engaging in a prohibited transaction.

However, OFAC will consider various mitigating factors in deciding whether to impose penalties against organizations for sanctioned transactions, including if the organizations adopted enhanced cybersecurity practices to reduce the risk of cyber-attacks, or promptly reported ransomware attacks to law enforcement and regulatory authorities (including the FBI, U.S. Secret Service and/or Treasury’s Office of Cybersecurity and Critical Infrastructure Protection).

“OFAC also will consider a company’s full and ongoing cooperation with law enforcement both during and after a ransomware attack” as a “significant” mitigating factor. In encouraging organizations to self-report ransomware attacks to federal authorities, OFAC notes that information shared with law enforcement may aid in tracking cybercriminals and disrupting or preventing future attacks.

Conclusion

In short, payment of a ransom is not illegal per se, so long as the transaction does not involve a sanctioned party on OFAC’s blacklist. Moreover, the recent ransomware Advisory “is explanatory only and does not have the force of law.” Nonetheless, organizations should consider carefully OFAC’s advice and guidance in deciding whether to pay a ransom demand.

In addition to the OFAC Advisory, management should consider the following:

  • Ability to restore systems from viable (unencrypted) backups

  • Marginal time savings in restoring systems with a decryptor versus backups

  • Preservation of infected systems in order to conduct a forensics investigation

  • Ability to determine whether data was accessed or exfiltrated (stolen)

  • Reputational harm if data is published by the threat actor

  • Likelihood that the organization will be legally required to notify individuals of the attack regardless of whether their data is published on the Dark Web

Should an organization decide it has no choice other than to make a ransom payment, it should facilitate the transaction through a reputable company that first performs and documents an OFAC sanctions check.

© 2021 Wilson Elser


CCPA Notice of Collection – Are You Collecting Geolocation Data, But Do Not Know It?

Businesses subject to the California Consumer Privacy Act (“CCPA”) are working diligently to comply with the CCPA’s numerous mandates, although final regulatory guidance has yet to be issued. Many of these businesses are learning that AB25, passed in October, requires that employees, applicants, and certain other California residents be provided a notice of collection, at least for the next 12 months. These businesses need to think about what must be included in these notices.

A Business Insider article explains that iPhones maintain a detailed list of every location the phone’s user frequents, including how long it took to get to that location and how long the user stayed there. The article provides helpful information about where that information is stored on the phone, how the data can be deleted, and, perhaps more importantly, how to stop the tracking of that information. This information may be important for users, as well as for companies that provide iPhones for their employees to use in connection with their work.

AB25 excepted natural persons acting as job applicants, employees, owners, directors, officers, medical staff members, and contractors of a CCPA-covered business from all of the CCPA protections except two: (i) the right to receive a notice of collection under Cal. Civ. Code Sec. 1798.100(b), and (ii) the right to bring a private civil action against a business in the event of a data breach caused by the business’s failure to maintain reasonable safeguards to protect personal information. The notice of collection must inform these persons of the categories of personal information collected by the business and how those categories are used.

The CCPA’s definition of personal information includes eleven categories of personal information, one of which is geolocation data. As many businesses think about the categories of personal information they collect from employees, applicants, and others for this purpose, geolocation may be the last thing that comes to mind. This is especially true for businesses with workforces that come into the office every day and that, unlike transportation, logistics, or home health care businesses, have no business need to know where their employees are. But they still may provide their workforce members a company-owned iPhone or other smart device with similar capabilities, without realizing all of its capabilities or configurations.

As many who have gone through compliance with the European Union’s General Data Protection Regulation know, the CCPA and other laws that may come after it in the U.S. will require businesses to think more carefully about the personal information they collect. They likely will find such information is being collected without their knowledge and not at their express direction, and they may have to communicate that collection (and use) to their employees.


Jackson Lewis P.C. © 2019

Will Technology Return Shame to Our Society?

The sex police are out there on the streets
Make sure the pass laws are not broken

Undercover (of the Night) – The Rolling Stones

So, now we know that browsing porn in “incognito” mode doesn’t prevent those sites from leaking your dirty data, courtesy of the friendly folks at Google and Facebook. 93 per cent of porn sites leak user data to a third party. Of the analyzed porn sites, Google tracks about 74 per cent, while Oracle tracks nearly 24 per cent and Facebook nearly 10 per cent. Yet, despite such stats, 30 per cent of all internet traffic still relates to porn sites.

The hacker who perpetrated the enormous Capital One data breach outed herself by oversharing on GitHub. Had she been able to keep her trap shut, we’d probably still not know that she was in our wallets. Did she want to get caught, or was she simply unashamed of having stolen a king’s ransom worth of financial data?

Many have lamented that shame (along with irony, truth and proper grammar) is dead.  I disagree.  I think that shame has been on the outward leg of a boomerang trajectory fueled by technology and is accelerating on the return trip to whack us noobs in the back of our unsuspecting heads.

Technology has allowed us to do all sorts of stuff privately that we used to have to muster the gumption to do in public.  Buying Penthouse the old-fashioned way meant you had to brave the drugstore cashier, who could turn out to be a cheerleader at your high school or your Mom’s PTA friend.  Buying the Biggie Bag at Wendy’s meant enduring the disapproving stares of vegans buying salads and diet iced tea.  Let’s not even talk about ED medication or baldness cures.

All your petty vices and vanity purchases can now be indulged in the sanctity of your bedroom.  Or so you thought.  There is no free lunch, naked or otherwise, we are coming to find.  How will society respond?

Country music advises us to dance like no one is watching and to love like we’ll never get hurt. When we are alone, we can act closer to our baser instincts.  This is why privacy is protective of creativity and subversive behaviors, and why in societies without privacy, people’s behavior regresses toward the most socially acceptable responses.  As my partner Ted Claypoole wrote in Privacy in the Age of Big Data,

“We all behave differently when we know we are being watched and listened to, and the resulting change in behavior is simply a loss of freedom – the freedom to behave in a private and comfortable fashion; the freedom to allow the less socially-careful branches of our personalities to flower. Loss of privacy reduces the spectrum of choices we can make about the most important aspects of our lives.

By providing a broader range of choices, and by freeing our choices from immediate review and censure from society, privacy enables us to be creative and to make decisions about ourselves that are outside the mainstream. Privacy grants us the room to be as creative and thought-provoking as we want to be. British scholar and law dean Timothy Macklem succinctly argues that the “isolating shield of privacy enables people to develop and exchange ideas, or to foster and share activities, that the presence or even awareness of other people might stifle. For better and for worse, then, privacy is a sponsor and guardian to the creative and the subversive.”

For the past two decades we have let down our guard, exercising our most subversive and embarrassing expressions of id in what we thought was a private space. Now we see that such privacy was likely an illusion, and we feel as if we’ve somehow been gaslighted into showing our noteworthy bad behavior in the disapproving public square.

Exposure of the Ashley Madison affair-seeking population should have taught us this lesson, but it seems that each generation needs to learn in its own way.

The nerds will, inevitably, figure out how to continue to work and play largely unobserved.  But what of the rest of us?  Will the pincer attack of the advancing surveillance state and the denizens of the Dark Web bring shame back as a countervailing force to govern our behavior?  Will the next decade be marked as the New Puritanism?

Dwight Lyman Moody, a prominent 19th-century evangelist, author, and publisher, famously said, “Character is what you are in the dark.” Through the night vision goggles of technology, more and more of your neighbors can see who you really are, and there are very few of us who can bear that kind of scrutiny. Maybe Mick Jagger had it right all the way back in 1983, when he advised “Curl up baby/Keep it all out of sight.” Undercover of the night indeed.



Copyright © 2019 Womble Bond Dickinson (US) LLP All Rights Reserved.

Personal Email Management Service Settles FTC Charges over Allegedly Deceptive Statements to Consumers over Its Access and Use of Subscribers’ Email Accounts

This week, the Federal Trade Commission (FTC) entered into a proposed settlement with Unrollme Inc. (“Unrollme”), a free personal email management service that offers to assist consumers in managing the flood of subscription emails in their inboxes. The FTC alleged that Unrollme made certain deceptive statements to consumers, who may have had privacy concerns, to persuade them to grant the company access to their email accounts. (In re Unrollme Inc., File No. 172 3139 (FTC proposed settlement announced Aug. 8, 2019).)

This settlement touches many relevant issues, including the delicate nature of online providers’ privacy practices relating to consumer data collection, the importance for consumers to comprehend the extent of data collection when signing up for and consenting to a new online service or app, and the need for downstream recipients of anonymized market data to understand how such data is collected and processed.  (See also our prior post covering an enforcement action involving user geolocation data collected from a mobile weather app).

A quick glance at headlines announcing the settlement might give the impression that the FTC found Unrollme’s entire business model unlawful or deceptive, but that is not the case.  As described below, the settlement involved only a subset of consumers who received allegedly deceptive emails to coax them into granting access to their email accounts.  The model of providing free products or services in exchange for permission to collect user information for data-driven advertising or ancillary market research remains widespread, though could face some changes when California’s CCPA consumer choice options become effective or in the event Congress passes a comprehensive data privacy law.

As part of the Unrollme registration process, users grant Unrollme access to selected personal email accounts for decluttering purposes. However, this permission also allows Unrollme to access and scan inboxes for so-called “e-receipts,” or emailed receipts from e-commerce transactions. After scanning users’ e-receipt data (which might include billing and shipping addresses and information about the purchased products or services), Unrollme’s parent company, Slice Technologies, Inc., would anonymize the data and package it into market research reports sold to various companies, retailers and others. According to the FTC complaint, when some consumers declined to grant permission to their email accounts during signup, Unrollme, during the relevant time period, tried to make them reconsider with allegedly deceptive statements about its access (e.g., “You need to authorize us to access your emails. Don’t worry, this is just to watch for those pesky newsletters, we’ll never touch your personal stuff”). The FTC claimed that such messages did not tell users that access to their inboxes would also be used to collect e-receipts and to package that data for sale to outside companies, and that thousands of consumers changed their minds and signed up for Unrollme as a result.
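A rough Python sketch of that pipeline, as the complaint describes it, might look like the following; the message fields and matching patterns are invented for illustration, not drawn from Unrollme’s actual code:

    import re
    from collections import Counter

    RECEIPT_SUBJECT = re.compile(r"receipt|order confirmation", re.IGNORECASE)

    def market_report(inbox):
        """Tally purchases per merchant from e-receipt-like messages,
        keeping the merchant but dropping fields identifying the buyer."""
        purchases = Counter()
        for msg in inbox:
            if RECEIPT_SUBJECT.search(msg["subject"]):
                purchases[msg["merchant_domain"]] += 1
        return purchases

    inbox = [
        {"subject": "Your order confirmation", "merchant_domain": "shoes.example"},
        {"subject": "Weekly newsletter", "merchant_domain": "news.example"},
    ]
    print(market_report(inbox))  # Counter({'shoes.example': 1})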

As part of the settlement, Unrollme is prohibited from misrepresentations about the extent to which it accesses, collects, uses, stores or shares information in connection with its email management products. Unrollme must also send an email to all current users who enrolled in Unrollme after seeing the allegedly deceptive statements and explain Unrollme’s data collection and usage practices.  Unrollme is also required to delete all e-receipt data obtained from recipients who enrolled in Unrollme after seeing the challenged statements (unless Unrollme receives affirmative consent to maintain such data from the affected consumers).

In an effort at increased transparency, Unrollme’s current home page displays several links to detailed explanations of how the service collects and analyzes user data (e.g., “How we use data”).

Interestingly, this is not the first time Unrollme’s practices have been challenged, as the company faced a privacy suit over its data mining practices last year. (See Cooper v. Slice Technologies, Inc., No. 17-7102 (S.D.N.Y. June 6, 2018) (dismissing a privacy suit that claimed that Unrollme did not adequately disclose to consumers the extent of its data mining practices, and finding that consumers consented to a privacy policy that expressly allowed such data collection to build market research products and services).)


© 2019 Proskauer Rose LLP.
This article is by Jeffrey D. Neuburger of Proskauer Rose LLP.

You Can be Anonymised But You Can’t Hide

If you think there is safety in numbers when it comes to the privacy of your personal information, think again. A recent study in Nature Communications found that, given a large enough dataset, anonymised personal information is only an algorithm away from being re-identified.

Anonymised data refers to data that has been stripped of any identifiable information, such as a name or email address. Under many privacy laws, anonymising data allows organisations and public bodies to use and share information without infringing an individual’s privacy, or having to obtain necessary authorisations or consents to do so.

But what happens when that anonymised data is combined with other data sets?

Researchers behind the Nature Communications study found that a model using only 15 demographic attributes can re-identify 99.98% of Americans in any incomplete dataset. While fascinating for data analysts, the finding may alarm individuals, whose anonymised data can be re-identified so easily and then potentially accessed or disclosed by others in ways they never envisaged.
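The mechanics are simple enough to sketch in Python. Below is a toy illustration of the underlying linkage attack: an “anonymised” dataset, with names stripped, is joined to an auxiliary, identified dataset on shared demographic attributes (all records here are invented):

    anonymised = [
        {"zip": "02139", "birth_year": 1985, "sex": "F", "diagnosis": "..."},
        {"zip": "02139", "birth_year": 1971, "sex": "M", "diagnosis": "..."},
    ]
    public_register = [
        {"name": "Jane Doe", "zip": "02139", "birth_year": 1985, "sex": "F"},
    ]

    QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

    def reidentify(anon_rows, identified_rows):
        # Link anonymised rows back to names via shared quasi-identifiers.
        for anon in anon_rows:
            key = tuple(anon[q] for q in QUASI_IDENTIFIERS)
            matches = [row for row in identified_rows
                       if tuple(row[q] for q in QUASI_IDENTIFIERS) == key]
            if len(matches) == 1:  # a unique match re-identifies the person
                yield matches[0]["name"], anon

    for name, record in reidentify(anonymised, public_register):
        print(name, "->", record["diagnosis"])

The more attributes an attacker can match on, the more likely each combination is to be unique, which is why 15 attributes suffice for near-certain re-identification.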

Re-identification techniques were recently used by the New York Times. In March this year, it pulled together various public data sources, including an anonymised dataset from the Internal Revenue Service, in order to reveal a decade’s worth of tax return data showing Donald Trump’s negative adjusted gross incomes. His tax returns had been the subject of great public speculation.

What does this mean for business? Depending on the circumstances, it could mean that simply removing personal information such as names and email addresses is not enough to anonymise data and may be in breach of many privacy laws.

To address these risks, companies like Google, Uber and Apple use “differential privacy” techniques, which add “noise” to datasets so that individuals cannot be re-identified, while still allowing access to the information outcomes they need.
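As a rough sketch of the “noise” idea, assuming a simple counting query, differential privacy adds random noise calibrated to the query’s sensitivity and a privacy budget (epsilon), so that no single individual’s presence detectably changes the answer:

    import numpy as np

    def dp_count(records, predicate, epsilon=0.1, sensitivity=1.0):
        """Differentially private count: the true count plus Laplace noise."""
        true_count = sum(1 for r in records if predicate(r))
        noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
        return true_count + noise

    ages = [34, 41, 29, 53, 41, 38]
    print(dp_count(ages, lambda age: age > 40))  # a noisy answer near the true 3

Smaller epsilon values mean more noise and stronger privacy; the production systems at these companies are, of course, far more sophisticated than this sketch.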

For many businesses using data anonymisation as a quick and cost-effective way to de-personalise data, it comes as a surprise that more may be needed to protect individuals’ personal information.

If you would like to know more about other similar studies, check out our previous blog post ‘The Co-Existence of Open Data and Privacy in a Digital World’.

Copyright 2019 K&L Gates
This article is by Cameron Abbott of K&L Gates.

No Means No

Researchers from the International Computer Science Institute found up to 1,325 Android applications (apps) gathering data from devices despite being explicitly denied permission.

The study looked at more than 88,000 apps from the Google Play store and tracked data transfers after permission had been denied. The 1,325 apps used tools, embedded within their code, that take personal data from Wi-Fi connections and from metadata stored in photos.
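One of those side channels is easy to picture: GPS coordinates embedded in a photo’s EXIF metadata are readable by any code with access to the file, even when the location permission itself has been denied. Here is a hedged Python sketch using the Pillow imaging library (“photo.jpg” is a placeholder path, and this illustrates the side channel generally, not any specific app’s code):

    from PIL import Image
    from PIL.ExifTags import GPSTAGS

    def photo_gps(path):
        """Return the raw GPS fields of an image's EXIF metadata, if any."""
        # _getexif() is Pillow's long-standing EXIF accessor for JPEGs.
        exif = Image.open(path)._getexif() or {}
        gps_ifd = exif.get(34853)  # 34853 is the standard EXIF GPSInfo tag
        if not gps_ifd:
            return None
        return {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}

    print(photo_gps("photo.jpg"))  # e.g. {'GPSLatitude': ..., 'GPSLongitude': ...}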

Consent presents itself in different ways in the world of privacy. The GDPR is clear in defining consent as it pertains to user data. Recital 32 notes that “Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data…” Consumers, pursuant to the CCPA, can opt out of having their personal data sold.

The specificity of consent has always been a tricky subject.  For decades, companies have offered customers the right to either opt in or out of “marketing,” often in exchange for direct payments. Yet, the promises have been slickly unspecific, so that a consumer never really knows what particular choices are being selected.

Does the option include data collection and, if so, how much? Does it include email, text, phone, and postal contacts for every campaign or just some? The GDPR’s specificity provision is supposed to address this problem. But companies are choosing not to offer these options, or are ignoring the consumer’s choice altogether.

Earlier this decade, General Motors caused a media dust-up by admitting it would continue collecting information about specific drivers and vehicles even if those drivers refused the OnStar system or turned it off. Now that policy is built into the OnStar terms of service. GM owners are left without a choice on privacy, and are bystanders to their driving and geolocation data being collected and used.

Apps can monitor people’s movements, finances, and health information. Because of these privacy risks, app platforms like Google and Apple make strict demands of developers including safe storage and processing of data. Seven years ago, Apple, whose app store has almost 1.8 million apps, issued a statement claiming that “Apps that collect or transmit a user’s contact data without their prior permission are in violation of our guidelines.”

Studies like this remind us mere data subjects that some rules were made to be broken. And even engaging with devices that have become a necessity to us in our daily lives may cause us to share personal information. Even more, simply saying no to data collection does not seem to suffice.

It will be interesting to see over the next couple of years whether tighter consent laws like the GDPR and the CCPA can cajole app developers not only to provide specific choices to their customers, but also to actually honor those choices.


Copyright © 2019 Womble Bond Dickinson (US) LLP All Rights Reserved.

The Digital Revolution Takes on New Meaning: Among Calls for Heightened U.S. Data Privacy Measures, California is King

California’s ambitious new data privacy law, the California Consumer Privacy Act of 2018 (“CCPA”),[1] will go into effect on January 1, 2020, and promises to bring a new era of digital regulation to America’s shores. Financial institutions that just navigated their way through implementing the European Union’s General Data Protection Regulation (“GDPR”),[2] which became effective in May 2018,[3] may be uneasy about the prospect of complying with yet another new data privacy compliance regime. They will find some comfort in the fact that many of the systems and processes designed for GDPR compliance will serve their needs under the CCPA as well. However, between now and the go-live date of the CCPA, U.S. federal and state laws and regulations are likely to continue to evolve and expand, and financial institutions will need to prepare for CCPA implementation while staying abreast of other fast-moving developments. In this article, we provide some key takeaways for how firms can be as prepared as possible for the continuing evolution of U.S. data privacy law.

  1. The New California Data Privacy Law Will Apply Broadly to Financial Institutions with Customers in California

Financial institutions with customers who are California residents almost certainly fit within the types of businesses to which the CCPA will apply. A “business” subject to the CCPA includes for-profit sole proprietorships, partnerships, limited liability companies, corporations, associations, or any other legal entities that collect consumers’ personal information and that satisfy one or more of the following criteria:

  • has annual gross revenues in excess of $25 million;

  • annually buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices; or

  • derives 50% or more of its annual revenue from selling consumers’ personal information.[4]

The CCPA also applies to legal entities that control or are controlled by a CCPA-covered business, and where the two legal entities share common branding (such as a shared name, servicemark, or trademark).[5]

For U.S. businesses seeking to remain outside the purview of the CCPA, the available carve-out is extremely narrow. Businesses that collect or sell the personal information of a California resident are exempt from the CCPA only if “every aspect of that commercial conduct takes place wholly outside of California.” This requires that (a) the personal information must have been collected when the consumer was outside of California, (b) no part of the sale of the consumer’s personal information occurred in California, and (c) no personal information collected while the consumer was in California was sold. In practice, this means that any firm with a website or other digital presence visited by California residents will likely be ensnared by the CCPA even if they lack employees or a physical presence in the state.[6]
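Encoded as a quick Python sketch, the three-pronged threshold test looks like the following; the function and field names are ours for illustration, not the statute’s:

    def ccpa_business(annual_gross_revenue: float,
                      consumer_records_handled: int,
                      revenue_share_from_selling_pi: float) -> bool:
        """Does a for-profit entity collecting Californians' personal
        information meet at least one CCPA 'business' threshold?"""
        return (annual_gross_revenue > 25_000_000
                or consumer_records_handled >= 50_000
                or revenue_share_from_selling_pi >= 0.50)

    print(ccpa_business(3_000_000, 60_000, 0.10))  # True: the 50,000-record prong is met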

Businesses that fail to comply with the CCPA are subject to the possibility of a state enforcement action and consumer lawsuits (the latter available only if, after receiving notice of the violation, the business fails to cure it within 30 days).[7] However, unlike the GDPR, which can impose fines calculated as a percentage of global revenue, the CCPA assesses penalties of up to $2,500 per violation and up to $7,500 per intentional violation.[8]

  2. California’s Expansive Concept of “Personal Information” Is Similar to the GDPR

When determining what consumer data will constitute personal information under the CCPA, firms can look to certain similarities with the GDPR.

Under the CCPA, “personal information” means “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” This includes, but is not limited to, names, addresses, identification number (such as social security, driver’s license, or passport), email address, and Internet Protocol (IP) address. It also includes biometric information, internet activity information (such as web browser or search history, or information regarding a consumer’s interaction with a website), geolocation data, and employment-related or education information.[9] This definition is largely consistent with how the GDPR broadly defines “personal data” for residents of the EU.[10]

The CCPA does not apply to data that has been “deidentified,” which means personal information that cannot reasonably identify, relate to, describe, or be linked to a particular consumer.[11] This is akin to the GDPR’s exclusion for “anonymized” data which cannot be used to identify a data subject. In addition, the CCPA does not apply to “aggregate consumer information,” which is information that relates to a group or category of consumers, from which individual consumer identities have been removed, that is not linked or reasonably linkable to any consumer or household or device.[12]

One difference between the two regimes, however, is that the CCPA’s definition of personal information excludes “publicly available” information, which is information that is lawfully made available from federal, state, or local government records.[13] The GDPR does not have a similar exception and instead provides the same protections to personal data regardless of its source.

  3. California Consumers Will Enjoy a New Bill of Rights Protecting Their Personal Information

Another similarity between the CCPA and the GDPR is the recognition of several fundamental rights that consumers will soon enjoy relating to the collection, use, and sale of their personal information. Under the CCPA, these can effectively be described as:

  • Right of Disclosure. A business that collects a consumer’s personal information will be required, at or before the point of collection, to inform consumers as to the categories of personal information to be collected and the purposes for which the categories of personal information will be used.[14] A consumer, i.e., a “natural person who is a California resident,” will also have the right to request that such a business disclose to that consumer the categories and specific pieces of personal information the business has collected.[15] Such a request must be complied with promptly, by mail or electronically, and free of charge to the consumer; however, businesses will not be required to provide such information per consumer request more than twice in a 12-month period.[16] Together with this right, consumers will also have the ability to request the business or commercial purpose for collecting or selling personal information, and the categories of third parties with whom the business shares personal information.[17] Finally, consumers will have the right to request that a business that sells the consumer’s personal information, or discloses it for a business purpose, disclose what personal information was collected and the categories of third parties to whom it was sold.[18]

  • Right of Deletion. A consumer will have the right to request that a business delete any personal information about the consumer which the business has collected from the consumer.[19] If a business has received such a request, it will be required not only to delete the consumer’s personal information from its records, but also to direct any service providers to do the same.[20] This obligation to delete personal information at consumer request is subject to several exceptions, including for the completion of a financial transaction, to detect security incidents or debug errors, and to comply with legal obligations.[21]

  • Right to “Opt Out.” A consumer will have the right to direct a business that sells personal information about the consumer to third parties not to sell the consumer’s personal information going forward.[22] Once a business has received such an instruction from a consumer, it may not resume selling that consumer’s personal information unless expressly authorized to do so.[23] This right of a consumer to “opt out” must be clearly communicated to consumers on a business’s website under a banner titled “Do Not Sell My Personal Information,” with an accompanying link that enables a customer to opt out of the sale of the consumer’s personal information.[24]

  • Right to Non-Discrimination. Businesses will be prohibited from discriminating against consumers who exercise their various rights under the CCPA by denying them goods or services, charging different prices, or providing a different level or quality of goods or services.[25]

  4. Financial Institutions Should Not Expect a Complete Carve-Out Under Federal Law

The CCPA will not apply to personal information that is collected, processed, sold, or disclosed under certain federal laws.[26] One such law is the Gramm-Leach-Bliley Act (“GLBA”),[27] which covers financial institutions that offer consumers financial products, like banks, and contains its own consumer privacy-related protections.[28] However, this is not a complete exception because the CCPA defines personal information far more broadly than the financial-transaction-related data contemplated by the GLBA, and includes such data as browser history and IP address. As a result, firms will need to contemplate what personal information they collect in addition to what is captured under the GLBA and be prepared to protect it accordingly under the CCPA.

  5. Conclusion

California may be the next big word on U.S. data privacy legislation, but it is unlikely to be the last. In recent years, Congress and other states have faced increased pressure to explore new cybersecurity and data privacy legislation due to a multitude of factors including a growing awareness of how businesses collect and use personal information as seen with Cambridge Analytica’s use of Facebook data, and public frustration with companies’ perceived lackluster responses to major customer data breaches.[29] A recent report from the U.S. Government Accountability Office further highlights America’s growing appetite for GDPR-like legislation, calling it an “appropriate time for Congress to consider comprehensive Internet privacy legislation.”[30]  And while the last Congress failed to enact any new national data privacy legislation into law, both the House and Senate have held hearings recently to receive testimony on guiding principles for a potential federal data privacy law, with a key question being whether any such law should preempt state laws like the CCPA.[31] So while a full-blown U.S. equivalent of the GDPR may not yet be in the cards, the current mood among the public and among lawmakers points in the direction of more rather than less intensive data privacy rules to come.

1  SB-1121, California Consumer Privacy Act of 2018 (Sept. 24, 2018).

2 European Commission, General Data Protection Regulation (Regulation (EU) 2016/679) of the European Parliament.

3  See Joseph Moreno et al., The EU’s New Data Protection Regulation – Are Your Cybersecurity and Data Protection Measures up to Scratch?, Cadwalader, Wickersham & Taft LLP (Mar. 6, 2017).

4   Cal. Civ. Code § 1798.140(c)(1).

5   § 1798.140(c)(2).

6   § 1798.145(a)(6).

7   § 1798.150(b).

8   § 1798.155(b).

9   § 1798.140(o)(1).

10  Article 4 of the GDPR defines “personal data” as “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”

11  § 1798.140(h).

12  § 1798.140(a).

13  § 1798.140(o)(2). Under the CCPA, personal information loses its “publicly available” designation if that data is “used for a purpose that is not compatible with the purpose for which the data is maintained and made available in the government records or for which it is publicly maintained.” Id.

14  § 1798.100(b).

15  § 1798.100(a).

16  § 1798.100(d).

17  § 1798.110(a).

18  § 1798.115(a).

19  § 1798.105(a).

20  § 1798.105(c).

21  § 1798.105(d).

22  § 1798.120(a).

23  § 1798.120(c).

24  § 1798.135(a)(1).

25  § 1798.125(a)(1).

26  § 1798.145(e).

27  15 U.S.C. §§ 6801-6809, 6821-6827.

28  Federal Financial Institutions Examination Council, Gramm-Leach-Bliley Summary of Provisions.

29  See Joseph Moreno, States Respond to Equifax Cyber Breach with Enforcement Actions and Calls for Enhanced Regulatory Powers, Cadwalader, Wickersham & Taft LLP (Oct. 13, 2017).

30  United States Government Accountability Office, Internet Privacy Additional Federal Authority Could Enhance Consumer Protection and Provide Flexibility (Jan. 2019), https://www.gao.gov/assets/700/696437.pdf.

31  U.S. House Committee on Energy & Commerce Subcommittee on Consumer Protection & Commerce, Hearing on “Protecting Consumer Privacy in the Era of Big Data” (Feb. 26, 2019); U.S. Senate Committee on Commerce, Science, and Transportation, Policy Principles for a Federal Data Privacy Framework in the United States (Feb. 27, 2019); Alfred Ng, At Hearing on Federal Data-Privacy Law, Debate Flares Over State Rules, CNET (Feb. 26, 2019); Daniel R. Stoller, New FTC Powers Weighed in Senate Data Privacy Hearing (1), Bloomberg Law (Feb. 27, 2019).


© Copyright 2019 Cadwalader, Wickersham & Taft LLP