U.S. Sues TikTok for Children’s Online Privacy Protection Act (COPPA) Violations

On Friday, August 2, 2024, the United States sued ByteDance, TikTok, and their affiliates for violating the Children’s Online Privacy Protection Act of 1998 (“COPPA”) and the Children’s Online Privacy Protection Rule (“COPPA Rule”). In its complaint, the Department of Justice alleges that TikTok collected, stored, and processed vast amounts of data from millions of child users of its popular social media app.

In June, the FTC voted to refer the matter to the DOJ, stating that it had determined there was reason to believe TikTok (f.k.a. Musical.ly, Inc.) had violated a 2019 FTC consent order and that the agency had also uncovered additional potential COPPA and FTC Act violations. The lawsuit, filed today in the Central District of California, alleges that TikTok is directed to children under age 13, that TikTok has permitted children to evade its age gate, that TikTok has collected data from children without first notifying their parents and obtaining verifiable parental consent, that TikTok has failed to honor parents’ requests to delete their children’s accounts and information, and that TikTok has failed to delete the accounts and information of users the company knows are children. The complaint also alleges that TikTok failed to comply with COPPA even for accounts in the platform’s “Kids Mode” and that TikTok improperly amassed profiles on Kids Mode users. The complaint seeks civil penalties of up to $51,744 per violation per day from January 10, 2024, to the present for the improper collection of children’s data, as well as permanent injunctive relief to prevent future violations of the COPPA Rule.

The lawsuit comes on the heels of the U.S. Senate’s passage this week of the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA) by a bipartisan 91-3 vote. It is unknown whether the House will take up the bills when it returns from recess in September.

American Privacy Rights Act Advances with Significant Revisions

On May 23, 2024, the U.S. House Committee on Energy and Commerce Subcommittee on Data, Innovation, and Commerce approved a revised draft of the American Privacy Rights Act (“APRA”), which was released just 36 hours before the markup session. With the subcommittee’s approval, the APRA will now advance to full committee consideration. The revised draft includes several notable changes from the initial discussion draft, including:

  • New Section on COPPA 2.0 – the revised APRA draft incorporates the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) under Title II, which differs to a certain degree from the COPPA 2.0 proposal currently before the Senate (e.g., removal of the revised “actual knowledge” standard; removal of applicability to teens over age 12 and under age 17).
  • New Section on Privacy By Design – the revised APRA draft includes a new dedicated section on privacy by design. This section requires covered entities, service providers and third parties to establish, implement, and maintain reasonable policies, practices and procedures that identify, assess and mitigate privacy risks related to their products and services during the design, development and implementation stages, including risks to covered minors.
  • Expansion of Public Research Permitted Purpose – as an exception to the general data minimization obligation, the revised APRA draft adds another permissible purpose for processing data for public or peer-reviewed scientific, historical, or statistical research projects. These research projects must be in the public interest and comply with all relevant laws and regulations. If the research involves transferring sensitive covered data, the revised APRA draft requires the affirmative express consent of the affected individuals.
  • Expanded Obligations for Data Brokers – the revised APRA draft expands obligations for data brokers by requiring them to provide a mechanism for individuals to submit a “Delete My Data” request. Similar to the California Delete Act, this mechanism requires data brokers, upon request, to delete all covered data related to an individual that they did not collect directly from that individual (a minimal sketch of such a request handler appears after this list).
  • Changes to Algorithmic Impact Assessments – while the initial APRA draft required large data holders to conduct and report a covered algorithmic impact assessment to the FTC if they used a covered algorithm posing a consequential risk of harm to individuals, the revised APRA draft requires such impact assessments for covered algorithms used to make a “consequential decision.” The revised draft also allows large data holders to use certified independent auditors to conduct the impact assessments, directs reporting to NIST instead of the FTC, and expands requirements related to algorithm design evaluations.
  • Consequential Decision Opt-Out – while the initial APRA draft allowed individuals to invoke an opt-out right against covered entities’ use of a covered algorithm making or facilitating a consequential decision, the revised draft now also allows individuals to request that consequential decisions be made by a human.
  • New and/or Revised Definitions – the revised APRA draft’s definitions section includes new terms, such as “contextual advertising” and “first party advertising.” The revised APRA draft also redefines certain terms, including “covered algorithm,” “sensitive covered data,” “small business” and “targeted advertising.”
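
For illustration, the TypeScript sketch below shows one way a data broker might implement the “Delete My Data” mechanism described in the list above, under the assumption (not spelled out in the draft) that each stored record tracks whether it was collected directly from the individual. The data model and function names are hypothetical.

```typescript
// Hypothetical sketch of a data broker's "Delete My Data" handler under the
// revised APRA draft: on request, delete all covered data about an individual
// that was NOT collected directly from that individual.

interface CoveredRecord {
  recordId: string;
  individualId: string;
  collectedDirectly: boolean; // true if the individual provided the data first-hand
  payload: Record<string, unknown>;
}

// In-memory store standing in for the broker's real data layer.
const store: CoveredRecord[] = [];

function handleDeleteMyDataRequest(individualId: string): number {
  let deleted = 0;
  for (let i = store.length - 1; i >= 0; i--) {
    const record = store[i];
    // The draft obligation reaches data the broker did not collect directly
    // from the requesting individual (e.g., data acquired from other sources).
    if (record.individualId === individualId && !record.collectedDirectly) {
      store.splice(i, 1);
      deleted++;
    }
  }
  return deleted; // number of records removed, useful for an audit trail
}
```

In a real system the deletion would run against the broker’s production data stores and be logged for compliance purposes; the in-memory array here simply keeps the example self-contained.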

FTC Denies AgeCheq Parental Consent Application But Trumpets General Support for COPPA Common Consent Mechanisms

Covington & Burling LLP

The Federal Trade Commission (“FTC”) recently reiterated its support for the use of “common consent” mechanisms that permit multiple operators to use a single system for providing notices and obtaining verifiable parental consent under the Children’s Online Privacy Protection Act (“COPPA”). COPPA generally requires operators of websites or online services that are directed to children under 13, or that have actual knowledge that they are collecting personal information from children under 13, to provide notice and obtain verifiable parental consent before collecting, using, or disclosing that information. The FTC’s regulations implementing COPPA (the “COPPA Rule”) do not explicitly address common consent mechanisms, but in the Statement of Basis and Purpose accompanying the 2013 revisions to the COPPA Rule, the FTC stated that “nothing forecloses operators from using a common consent mechanism as long as it meets the Rule’s basic notice and consent requirements.”

The FTC’s latest endorsement of common consent mechanisms appeared in a letter explaining why the FTC was denying AgeCheq, Inc.’s application for approval of a common consent method. The COPPA Rule establishes a voluntary process whereby companies may submit a formal application to have new methods of parental consent considered by the FTC. The FTC denied AgeCheq’s application because it “incorporates methods already enumerated” in the COPPA Rule: (1) a financial transaction, and (2) a print-and-send form. The implementation of these approved methods of consent in a common consent mechanism was not enough to merit a separate approval from the FTC. According to the FTC, the COPPA Rule’s new consent approval process was intended to vet new methods of obtaining verifiable parental consent rather than specific implementations of approved methods. While AgeCheq’s application was technically “denied,” the FTC emphasized that AgeCheq and other “[c]ompanies are free to develop common consent mechanisms without applying to the Commission for approval.” In support of common consent mechanisms, the FTC quoted language from the 2013 Statement of Basis and Purpose and pointed out that at least one COPPA Safe Harbor program already relies on a common consent mechanism.


Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps

The National Law Review recently published an article, Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps, written by Daniel F. Gottlieb, Randall J. Ortman, and Heather Egan Sussman of McDermott Will & Emery:


On February 1, 2013, the Federal Trade Commission (FTC) released a report entitled “Mobile Privacy Disclosures: Building Trust Through Transparency” (Report), which urges mobile device application (app) platforms and developers to improve the privacy policies for their apps to better inform consumers about their privacy practices. This report follows other recent FTC publications concerning mobile apps, including “Mobile Apps for Kids: Disclosures Still Not Making the Grade,” released December 2012 (December 2012 Report), and “Mobile Apps for Kids: Current Privacy Disclosures are Disappointing,” released February 2012 (February 2012 Report), as well as the adoption of the amended Children’s Online Privacy Protection Act (COPPA) Rule on December 19, 2012. (See “FTC Updates Rule for Children’s Online Privacy Protection” for more information regarding the recent COPPA amendments.)

Among other things, the Report offers recommendations to key stakeholders in the mobile device application marketplace, particularly operating system providers (e.g., Apple and Microsoft), application developers, advertising networks and related trade associations.  Such recommendations reflect the FTC’s enforcement and policy experience with mobile applications and public comment on the matter; however, where the Report goes beyond existing legal requirements, “it is not intended to serve as a template for law enforcement actions or regulations under laws currently enforced by the FTC.”  Nevertheless, such key stakeholders should take the FTC’s recommendations into account when determining how they will collect, use and transfer personal information about consumers and preparing privacy policies to describe their information practices because they reflect the FTC’s expectations under its consumer protection authorities.

At a minimum, operating system providers and application developers should review their existing privacy policies and make revisions, as necessary, to comply with the recommendations included within the Report.  However, all key stakeholders should consider the implications of recommendations specific to their industry segment, as summarized below.

Operating System Providers

The Report characterizes operating system providers as “gatekeepers to the app marketplace” and states that they have the “greatest ability to effectuate change with respect to improving mobile privacy disclosures.” Operating system providers, which create and maintain the platforms on which mobile apps run, promulgate rules that app developers must follow in order to access those platforms and facilitate interactions between developers and consumers. Given their prominent role within the app marketplace, it is not surprising that the FTC directs numerous recommendations toward operating system providers, including:

  • Just-In-Time Disclosures.  The Report urges operating system providers to display just-in-time disclosures to consumers and obtain express, opt-in (rather than implied) consent before allowing apps to access sensitive information such as geolocation (i.e., the real-world physical location of a mobile device) and other information that consumers may find sensitive, such as contacts, photos, calendar entries, or recorded audio or video.  Operating system providers and mobile app developers should therefore carefully consider which types of personal information practices require an opt-in rather than mere use of the app to evidence consent (a minimal consent-gate sketch appears after this list).
  • Privacy Dashboard.  The Report suggests that operating system providers should consider developing a privacy “dashboard” that would centralize privacy settings for various apps to allow consumers to easily review the types of information accessed by the apps they have downloaded.  The “dashboard” model would enable consumers to determine which apps have access to different types of information about the consumer or the consumer’s device and to revisit the choices they initially made about the apps.
  • Icons.  The Report notes that operating system providers currently use status icons for a variety of purposes, such as indicating when an app is accessing geolocation information.  The FTC suggests expansion of this practice to provide an icon that would indicate the transmission of personal information or other information more broadly.
  • Best Practices.  The Report recommends that operating system providers establish best practices for app developers.  For example, operating system providers can compel app developers to make privacy disclosures to consumers by restricting access to their platforms.
  • Review of Apps.  The Report suggests that operating system providers should also make clear disclosures to consumers about the extent to which they review apps developed for their platforms.  Such disclosures may include conditions for making apps available within the platform’s app marketplace and efforts to ensure continued compliance.
  • Do Not Track Mechanism.  The Report directs operating system providers to consider offering a “Do Not Track” (DNT) mechanism, which would provide consumers with the option to prevent tracking by advertising networks or other third parties as they use apps on their mobile devices.  This approach allows consumers to make a single election, rather than case-by-case decisions for each app.
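
As a rough illustration of the just-in-time disclosure recommendation in the first bullet above, the TypeScript sketch below gates access to geolocation behind an explicit, opt-in prompt. The disclosure dialog and consent store are hypothetical placeholders rather than any real platform API; only the standard browser geolocation call is real.

```typescript
// Minimal sketch of a just-in-time, opt-in consent gate before accessing
// geolocation, in the spirit of the Report's recommendation.

type SensitiveCategory = "geolocation" | "contacts" | "photos" | "calendar" | "audio" | "video";

const grantedConsents = new Set<SensitiveCategory>();

// Placeholder for a dialog that explains, at the moment of access, what will
// be collected and why, and returns the user's explicit choice.
async function showJustInTimeDisclosure(category: SensitiveCategory, purpose: string): Promise<boolean> {
  console.log(`Requesting access to ${category}: ${purpose}`);
  return false; // this stub defaults to "no consent"
}

async function getLocationWithConsent(): Promise<GeolocationPosition | null> {
  if (!grantedConsents.has("geolocation")) {
    const optedIn = await showJustInTimeDisclosure(
      "geolocation",
      "Used to show nearby results; not shared with advertisers."
    );
    if (!optedIn) return null; // no implied consent: do not touch the sensor
    grantedConsents.add("geolocation");
  }
  return new Promise<GeolocationPosition | null>((resolve) =>
    navigator.geolocation.getCurrentPosition(resolve, () => resolve(null))
  );
}
```

The key design point, consistent with the Report, is that consent is express and obtained at the moment of access rather than implied from use of the app.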

App Developers

Although some practices may be imposed upon app developers by operating system providers, as discussed above, app developers can take several steps to adopt the FTC’s recommendations, including:

  • Privacy Policies.  The FTC encourages all app developers to have a privacy policy and to include a reference to that policy when submitting apps to an operating system provider.
  • Just-In-Time Disclosures.  As with the recommendations for operating system providers, the Report suggests that app developers provide just-in-time disclosures and obtain affirmative express consent before collecting and sharing sensitive information.
  • Coordination with Advertising Networks.  The FTC argues for improved coordination and communication between app developers and advertising networks and other third parties that provide certain functions, such as data analytics, to ensure app developers have an adequate understanding of the software they are incorporating into their apps and can accurately describe such software to consumers.
  • Participation in Trade Associations.  The Report urges app developers to participate in trade associations and other industry organizations, particularly in the development of self-regulatory programs addressing privacy in mobile apps.

Advertising Networks and Other Third Parties

By specifically including advertising networks and other third parties in the Report, the FTC recognizes that cooperation with such networks and parties is necessary to achieve the recommendations outlined for operating system providers and app developers.  The recommendations for advertising networks and other third parties include:

  • Coordination with App Developers.  The Report calls upon advertising networks and other third parties to communicate with app developers to enable such developers to provide accurate disclosures to consumers.
  • DNT Mechanism.  Consistent with its recommendations for operating system providers, the FTC suggests that advertising networks and other third parties work with operating system providers to implement a DNT mechanism.

Trade Associations

The FTC states that trade associations can facilitate standardized privacy disclosures.  The Report makes the following recommendations for trade associations:

  • Icons.  Trade associations can work with operating system providers to develop standardized icons to indicate the transmission of personal information and other data.
  • Badges.  Similar to icons, the Report suggests that trade associations consider developing “badges” or other visual cues used to convey information about a particular app’s data practices.
  • Privacy Policies.  Finally, the FTC suggests that trade associations are uniquely positioned to explore other opportunities to standardize privacy policies across the mobile app industry.

Children and Mobile Apps

Commenting on progress between the February 2012 Report and December 2012 Report, both of which relied on a survey of 400 mobile apps targeted at children, the FTC stated that “little or no progress has been made” in increasing transparency in the mobile app industry with regard to privacy practices specific to children.  The December 2012 Report suggests that very few mobile apps targeted to children include basic information about the app’s privacy practices and interactive features, including the type of data collected, the purpose of the collection and whether third parties have access to such data:

  • Privacy Disclosures.  According to the December 2012 Report, approximately 20 percent of the mobile apps reviewed disclosed any privacy-related information prior to the download process and the same proportion provided access to a privacy disclosure after downloading the app.  Among those mobile apps, the December 2012 Report characterizes their disclosures as lengthy, difficult to read or lacking basic detail, such as the specific types of information collected.
  • Information Collection and Sharing Practices.  The December 2012 Report notes that 59 percent of the mobile apps transmitted some information to the app developer or to a third party.  Unique device identifiers were the most frequently transmitted data point, which the December 2012 Report cites as problematic, suggesting that such identifiers are routinely used to create user “profiles,” which may track consumers across multiple mobile apps.
  • Disclosure Practices Regarding Interactive App Features.  The FTC reports that nearly half of the apps that stated they did not include advertising actually contained advertising, including ads targeted to a mature audience.  Similarly, the December 2012 Report notes that approximately 9 percent of the mobile apps reviewed disclosed that they linked with social media applications; however, this number represented only half of the mobile apps that actually linked to social media applications.  Mobile app developers using a template privacy policy as a starting point for an app’s privacy policy should carefully tailor the template to reflect the developer’s actual privacy practices for the app.

Increased Enforcement

In addition to the reports discussed above and the revisions to the COPPA Rule (effective July 1, 2013), the FTC has also increased enforcement efforts relating to mobile app privacy.  On February 1, 2013, the FTC announced an agreement with Path Inc., operator of the Path social networking mobile app, to settle allegations that it deceived consumers by collecting personal information from their mobile device address books without their knowledge or consent.  Under the terms of the agreement, Path Inc. must establish a comprehensive privacy program, obtain independent privacy assessments every other year for the next 20 years, and pay $800,000 in civil penalties specifically relating to alleged violations of the COPPA Rule.  In announcing the agreement, the FTC commented on its commitment to continued scrutiny of privacy practices within the mobile app industry, adding that “no matter what new technologies emerge, the [FTC] will continue to safeguard the privacy of Americans.”

Key Takeaways

App developers and other key stakeholders should consider the following next steps:

  • Review existing privacy policies to confirm they accurately describe current privacy practices for the particular app rather than merely following the developer’s preferred template privacy policy
  • Where practical, update actual privacy practices and privacy policies to be more in line with the FTC’s expectations for transparency and consumer choice, including use of opt-in rather than opt-out consent models
  • Revisit privacy practices in light of heightened FTC enforcement under COPPA and its other consumer protection authorities

© 2013 McDermott Will & Emery

FTC Proposes New Rules on Children’s Online Privacy Issues

Michelle Cohen of Ifrah Law recently had an article regarding Children’s Online Privacy published in The National Law Review:

On August 1, 2012, the Federal Trade Commission announced that it is issuing a Supplemental Notice of Proposed Rulemaking to modify certain of its rules under the Children’s Online Privacy Protection Act (COPPA). Industry has been waiting on FTC action regarding COPPA, as the agency previously undertook a COPPA rulemaking in September 2011 and proposed modifying certain COPPA rules to account for changes in technology, particularly mobile technology.

The FTC received over 350 comments in response. After reviewing those comments, the FTC has decided to propose certain additional changes to its COPPA Rule definitions.

In summary, COPPA gives parents control over the information websites can collect from their kids. It applies to websites directed to children under 13, or to operators with actual knowledge that they are collecting personal information from children under 13. It requires a specific privacy notice and, in many circumstances, parental consent before children’s information may be collected and/or used.

The FTC has proposed several changes that are of interest. Some are meant to “tighten” the COPPA Rule, while others are meant to provide additional flexibility to operators.

  • One proposed change would make clear that an operator that chooses to integrate the services of third parties that collect personal information from visitors (such as ad networks or plug-ins) would itself be considered a covered “operator” under the Rule.
  • The FTC is also proposing to allow websites with mixed audiences (e.g., sites that appeal to children as well as to parents and users over 13) to age-screen visitors so that COPPA’s protections are applied only to those under 13 (see the age-gate sketch after this list). However, child-directed sites or services that knowingly target children under 13 as their primary audience, or whose overall content is likely to attract children under that age, could not use that method.
  • Also, the FTC has proposed modifying the definition of what constitutes “personal information” relating to children to make clear that a persistent identifier falls within that definition if it can be used to recognize a user over time or across different sites or services. The FTC is considering whether activities like site maintenance and analysis, use of persistent identifiers for authenticating users, maintaining user preferences, serving contextual ads, and protecting against fraud and theft should not be considered the collection of “personal information,” as long as what is collected is not used or disclosed to contact a specific individual, including through behaviorally targeted advertising.
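
The age-screening approach described in the second bullet above might look something like the following TypeScript sketch, which applies a neutral birth-date check and routes visitors under 13 to a COPPA-protected experience. The routing labels and data model are illustrative assumptions, not part of the FTC’s proposal.

```typescript
// Illustrative sketch of a neutral age screen for a mixed-audience site:
// visitors who indicate they are under 13 receive COPPA protections (no
// personal information collected without verifiable parental consent).

interface Visitor {
  sessionId: string;
  birthDate: Date; // collected via a neutral age gate that does not hint at a "correct" answer
}

function isUnder13(birthDate: Date, today: Date = new Date()): boolean {
  const thirteenthBirthday = new Date(birthDate);
  thirteenthBirthday.setFullYear(thirteenthBirthday.getFullYear() + 13);
  return today < thirteenthBirthday;
}

function routeVisitor(visitor: Visitor): "coppa-protected" | "general" {
  if (isUnder13(visitor.birthDate)) {
    // Under-13 path: suppress collection of personal information and
    // behavioral advertising until verifiable parental consent is obtained.
    return "coppa-protected";
  }
  return "general";
}
```

As the FTC notes, this screening option would not be available to sites or services that target children under 13 as their primary audience.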

Comments on the FTC’s proposed rule changes are due by September 10, 2012.

© 2012 Ifrah PLLC