Judge Approves $92 Million TikTok Settlement

On July 28, 2022, a federal judge approved TikTok’s $92 million class action settlement of various privacy claims made under state and federal law. The agreement will resolve litigation that began in 2019 and involved claims that TikTok, owned by the Chinese company ByteDance, violated the Illinois Biometric Information Privacy Act (“BIPA”) and the federal Video Privacy Protection Act (“VPPA”) by improperly harvesting users’ personal data. U.S. District Court Judge John Lee of the Northern District of Illinois also awarded approximately $29 million in fees to class counsel.

The class action claimants alleged that TikTok violated BIPA by collecting users’ faceprints without their consent and violated the VPPA by disclosing personally identifiable information about the videos people watched. The settlement agreement also provides for several forms of injunctive relief, including:

  • Refraining from collecting and storing biometric information, collecting geolocation data and collecting information from users’ clipboards, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Not transmitting or storing U.S. user data outside of the U.S., unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • No longer pre-uploading U.S. user generated content, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Deleting all pre-uploaded user generated content from users who did not save or post the content; and
  • Training all employees and contractors on compliance with data privacy laws and company procedures.
Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

Throwing Out the Privacy Policy is a Bad Idea

The public internet has existed for about thirty years, and consumers’ graphics-heavy, browser-based experience for about twenty-five. In the early days, commercial websites operated without privacy policies.

Eventually, people started to realize that they were leaving trails of information online, and in the early aughts it became clear how businesses could capture and profit from these trails, even though the actual uses of the data on individual sites were not clear. People asked for greater transparency from the sites they visited, and in response they received the privacy policy.

A deeply flawed instrument, the website privacy policy purports to explain how a website owner gathers and uses information, but most such policies are strangely both imprecise and too long, losing the average reader in a fog of legalese and marginally relevant facts. Some privacy policies are intentionally opaque, because it doesn’t profit the website operator to make its methods obvious. Many are overly general, in part because the company doesn’t want to revise its policy every time it shifts business practices or vendor alliances. Many are simply messy and poorly written.

Part of the reason that privacy policies are confusing is that data privacy is not a precise concept. The definition of data is context dependent. Data can mean information about a transaction, information gathered from your browser visit (including where you were before and after the visit), information about you or your equipment, or even information derived by analyzing other information. And we know that de-identified data can often be re-identified, and that even a collection of generic data can be combined in ways that identify a person.


The definition of privacy is also untidy. An ecommerce company must capture certain information to fulfill an online order. In this era of connected objects, the company may continue to take information from the item while the consumer is using it. This is true for equipment from televisions to dishwashers to sex toys. The company likely uses this information internally to develop its products. It may use the data to market more goods or services to the consumer. It may transfer the information to other companies so they can market their products more effectively. The company may provide the information to the government. This week’s New Yorker devotes several pages to how the word “privacy” conflates major concepts in US law, including secrecy and autonomy,1 and is thus confusing to courts and the public alike.

All of this is difficult to reflect in a privacy policy, even if the company has incentive to provide useful information to its customers.

Last month the Washington Post ran an article by Geoffrey Fowler that was subtitled “Let’s abolish reading privacy policies.” The article notes a 2019 Pew survey claiming that only 9 percent of Americans say they always read privacy policies. I would suggest that more than half of those Americans are lying. Almost no one always reads privacy policies upon first entering a website or downloading an app. That’s not even really what privacy policies are for.

Fowler shows why people do not read these policies. He writes, “As an experiment, I tallied up all of the privacy policies just for the apps on my phone. It totaled nearly 1 million words. ‘War and Peace’ is about half as long. And that’s just my phone. Back in 2008, Lorrie Cranor, a professor of engineering and public policy at Carnegie Mellon University, and a colleague estimated that reading and consenting to all the privacy policies on websites Americans visit would take 244 hours per year.”

The length, complexity and opacity of online privacy policies are a real concern. But the best remedy is not to eliminate privacy policies; it is to make them less instrumental in the most important decisions about the data that describes us.


Website owners should not be expected to write privacy policies that are both sufficiently detailed and succinctly readable enough for consumers to make meaningful choices about the use of the data that describes them. That kind of system makes a person responsible for her own data protection and takes the onus off the company to limit its use of the data. It is like our current system of waste recycling – both ineffective and supported by polluters, because rather than forcing manufacturers to use more environmentally friendly packaging, it pushes consumers to deal with the problem at home, shifting the burden from industry to us. Similarly, if legislatures provided a set of simple rules for website operators – here is what you may do with personal data, and here is what you may not – then no one would need to read privacy policies to make sure data about our transactions was spared the worst treatment. The worst treatment would be illegal.

State laws are moving in this direction, providing simpler rules that restrict certain uses and transfers of personal and sensitive data. We are early in the process, but if omnibus state privacy laws follow the same trajectory as data breach disclosure laws, which every state eventually passed, we can be optimistic and expect full coverage of online privacy rules for all Americans within a decade or so. But we shouldn’t need to wait for every state to act.

Unlike data breach disclosure laws, which push companies to comply only with the laws relevant to a particular loss of data, omnibus privacy laws affect how companies conduct everyday business. It will therefore take requirements in only a few states before big companies start building their privacy rights recognition functions around the lowest common denominator. It will simply make economic sense for businesses to give every U.S. customer the same rights that the most protective state provides its residents. Why build 50 sets of rules when you don’t need to? The savings from maintaining a single privacy rights-recognition system will offset the cost of extending privacy rights to people in states that haven’t yet passed omnibus laws.

This won’t make privacy policies any easier to read, but it will make reading them less important. Privacy policies can then return to their core function of providing a record of how a company treats data: a reference document rather than a set of choices inset into a pillow of legal terms.

We shouldn’t eliminate the privacy policy. We should reduce its importance and limit its functions, easing customer frustration with the role the privacy policy plays in our current process. Limit companies’ use of data, and we won’t need to fight through their privacy options.


ENDNOTES

1 Privacy law also conflates these meanings with obscurity in a crowd or in public.


Article By Theodore F. Claypoole of Womble Bond Dickinson (US) LLP

Copyright © 2022 Womble Bond Dickinson (US) LLP All Rights Reserved.

Apple Imposes Privacy Policy Requirement for All Apps Operating on its Platform

As Apple recently reminded developers, starting on October 3, 2018, it will require all apps submitted for distribution through its App Store, or for testing via its TestFlight service, to have a publicly posted privacy policy. This requirement has been incorporated into Apple’s App Store Review Guidelines and will apply to all new apps, as well as all updated versions of existing apps. Previously, only those apps that collected user information had to have a privacy policy.

Apple’s previous requirements were consistent with a 2012 Joint Statement of Principles agreement that Apple and other app store platforms made with the California Attorney General. In that statement, the platforms agreed to require apps that collect information to conspicuously post a privacy policy telling consumers how their personal data was being collected, used, and shared. To encourage transparency of apps’ privacy practices, the platforms also agreed to allow app developers to link to their privacy policy directly from the store. Finally, the platforms agreed to create ways for consumers to notify them if an app was not living up to its policies, and to respond to such complaints.

The new Guidelines build on the principles established in 2012 and expand the privacy policy requirement to all apps, even utility apps that do not collect user information and apps still in the testing phase. Per the Guidelines, the policy will need to be included in the App Store Connect metadata field and as a link in the app itself. Without the policy, the app will not be reviewed and will not be made available on Apple’s platform.

Under the new Guidelines, an app’s privacy policy must still describe what data the app collects, how that data is collected, and how it is used. The policy must also tell users how long the app developer will keep the information it collects and how the information will be deleted. The Guidelines further require the policy to inform users how they can revoke their consent (if applicable) for data collection and how to request that their data be deleted. Finally, the policy will have to confirm that the app will follow Apple’s guidelines on sharing information with third parties, and that any third party receiving the information will be held to Apple’s data security guidelines. If the app’s privacy policy sets higher standards for data protection than Apple’s guidelines, the third party will have to meet that benchmark as well.

Putting it Into Practice: This announcement is a reminder for companies to look at how they are sharing privacy practices with consumers across a variety of platforms, including mobile apps.


Copyright © 2018, Sheppard Mullin Richter & Hampton LLP.