WW International to Pay $1.5 Million Civil Penalty for Alleged COPPA Violations

In 2014, with childhood obesity on the rise in the United States, tech company Kurbo, Ltd. (Kurbo) marketed a free app for kids that, according to the company, was “designed to help kids and teens ages 8-17 reach a healthier weight.” When WW International (WW) (formerly Weight Watchers) acquired Kurbo in 2018, the app was rebranded “Kurbo by WW,” and WW continued to market the app to children as young as eight. But according to the Federal Trade Commission (FTC), Kurbo’s privacy practices were not exactly child-friendly, even if its app was. The FTC’s complaint, filed by the Department of Justice (DOJ) last month, claims that WW’s notice, data collection, and data retention practices violated the Children’s Online Privacy Protection Act Rule (COPPA Rule). WW and Kurbo, under a stipulated order, agreed to pay a $1.5 million civil penalty in addition to complying with a range of injunctive provisions. These provisions include, but are not limited to, deleting all personal information of children whose parents did not provide verifiable parental consent in a specified timeframe, and deleting “Affected Work Product” (defined in the order to include any models or algorithms developed in whole or in part using children’s personal information collected through the Kurbo Program).

Complaint Background

The COPPA Rule applies to any operator of a commercial website or online service directed to children that collects, uses, and/or discloses personal information from children and to any operator of a commercial website or online service that has actual knowledge that it collects, uses, and/or discloses personal information from children. Operators must notify parents and obtain their consent before collecting, using, or disclosing personal information from children under 13.

The complaint states that children enrolled in the Kurbo app by signing up through the app or having a parent do it on their behalf. Once on Kurbo, users could enter personal information such as height, weight, and age, and the app then tracked their weight, food consumption, and exercise. However, the FTC alleges that Kurbo’s age gate was porous, requiring no verification process to establish that children who affirmed they were at least 13 were the age they claimed to be or that users asserting they were parents were indeed parents. The complaint alleges that the registration area featured a “tip-off” screen that gave visitors just two choices for registration: the “I’m a parent” option or the “I’m at least 13” option. Visitors saw the legend, “Per U.S. law, a child under 13 must sign up through a parent,” on the registration page featuring these choices. In practice, thousands of users who indicated that they were at least 13 were actually younger and could later change the age in their profiles; these users, along with users who falsely claimed to be parents, were able to continue using the app. In 2020, after a warning from the FTC, Kurbo implemented a registration screen that removed the legend and the “at least 13” option. However, the new process still failed to provide verification measures to establish that users claiming to be parents were indeed parents.
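
For context, a COPPA-style age screen is generally expected to ask for a birth date neutrally, without signaling which answer avoids the parental consent step, and to route under-13 users into a verifiable parental consent flow before any personal information is collected. The Python sketch below illustrates only that general pattern; the function and field names are hypothetical and are not drawn from the Kurbo app or the FTC’s order.

```python
from datetime import date
from typing import Optional

COPPA_AGE_THRESHOLD = 13  # COPPA's protections apply to children under 13

def age_on(birth_date: date, today: date) -> int:
    """Return a user's age in whole years as of `today`."""
    years = today.year - birth_date.year
    # Subtract a year if the birthday has not yet occurred this year.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def route_registration(birth_date: date, today: Optional[date] = None) -> str:
    """Neutrally route a registrant based on a self-reported birth date.

    Hypothetical flow: the form collects a birth date without signaling a
    "correct" answer, and under-13 users are sent to a verifiable parental
    consent (VPC) step before any personal information is collected.
    """
    today = today or date.today()
    if age_on(birth_date, today) < COPPA_AGE_THRESHOLD:
        return "require_verifiable_parental_consent"
    return "standard_signup"

# Example: a child born in 2014 who registers in 2022 is routed to the VPC step.
print(route_registration(date(2014, 5, 1), today=date(2022, 3, 1)))
```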

Kurbo’s notice of data collection and data retention practices also fell short. The COPPA Rule requires an operator to “post a prominent and clearly labeled link to an online notice of its information practices with regard to children on the home or landing page or screen of its Web site or online service, and, at each area of the Web site or online service where personal information is collected from children.” But beginning in November 2019, Kurbo’s notice at registration was buried in a list of hyperlinks that parents were not required to click through, and the notice failed to list all the categories of information the app collected from children. Further, Kurbo did not comply with the COPPA Rule’s mandate to keep children’s personal information only as long as reasonably necessary for the purpose it was collected and then to delete it. Instead, the company held on to personal information indefinitely unless parents specifically requested its removal.

Stipulated Order

In addition to imposing a $1.5 million civil penalty, the order, which was approved by the court on March 3, 2022, requires WW and Kurbo to:

  • Refrain from disclosing, using, or benefitting from children’s personal information collected in violation of the COPPA Rule;
  • Delete all personal information Kurbo collected in violation of the COPPA Rule within 30 days;
  • Provide a written statement to the FTC that details Kurbo’s process for providing notice and seeking verifiable parental consent;
  • Destroy all Affected Work Product derived from improperly collected children’s personal information and confirm to the FTC that the deletion has been carried out;
  • Delete all children’s personal information collected within one year of the user’s last activity on the app; and
  • Create and follow a retention schedule that states the purpose for which children’s personal information is collected, the specific business need for retaining such information, and criteria for deletion, including a set timeframe no longer than one year.
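
The last two requirements effectively describe a retention schedule keyed to a child’s last activity on the app. The following is a minimal sketch, under assumed data structures and names, of how such a one-year limit might be checked in code; it illustrates the concept and does not restate the order’s terms.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

MAX_RETENTION = timedelta(days=365)  # the order caps retention at one year

@dataclass
class ChildRecord:
    user_id: str
    last_activity: datetime  # the child's last activity on the app

def records_due_for_deletion(records: List[ChildRecord],
                             now: datetime) -> List[str]:
    """Return the IDs of children's records held past the retention limit."""
    return [r.user_id for r in records if now - r.last_activity > MAX_RETENTION]

# Example: one recently active record and one stale record.
records = [
    ChildRecord("child-1", datetime(2022, 1, 15)),
    ChildRecord("child-2", datetime(2020, 6, 1)),
]
print(records_due_for_deletion(records, now=datetime(2022, 3, 3)))  # ['child-2']
```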

Implications of the Order

Following the U.S. Supreme Court’s decision in AMG Capital Management, LLC v. Federal Trade Commission, which held that the FTC cannot use its Section 13(b) authority to seek equitable monetary relief, such as restitution or disgorgement, for violations of the FTC Act, the FTC has been pushing Congress to grant it greater enforcement powers. In the meantime, the FTC has used other enforcement tools, including the recent resurrection of the agency’s long-dormant Penalty Offense Authority under Section 5(m)(1)(B) of the FTC Act and a renewed willingness to use algorithmic disgorgement (which the FTC first applied in the 2019 Cambridge Analytica case).

Algorithmic disgorgement involves “requir[ing] violators to disgorge not only the ill-gotten data, but also the benefits—here, the algorithms—generated from that data,” as then-Acting FTC Chair Rebecca Kelly Slaughter stated in a speech last year. This order appears to be the first time algorithmic disgorgement was applied by the Commission in an enforcement action under COPPA.

Children’s privacy issues continue to attract the attention of the FTC and lawmakers at both federal and state levels. Companies that collect children’s personal information should be careful to ensure that their privacy policies and practices fully conform to the COPPA Rule.

© 2022 Keller and Heckman LLP

Recent COPPA Settlements Offer Compliance Reminders

The recently announced FTC settlement with YouTube and its parent company, as well as the 2018 settlement between the New York Office of the Attorney General and Oath, has set a new bar for COPPA compliance.

The settlements offer numerous takeaways, including reminders for those that use persistent identifiers to track children online and deliver targeted ads to them.  These takeaways include, but are not limited to, the following.

FTC Chairman Joseph Simons stated that “YouTube touted its popularity with children to prospective corporate clients … yet when it came to complying with COPPA, the company refused to acknowledge that portions of its platform were clearly directed to kids.”

First, under COPPA, a child-directed website or online service – or a site that has actual knowledge it’s collecting or maintaining personal information from a child – must give clear notice on its site of “what information it collects from children, how it uses such information and its disclosure practices for such information.”

Second, the website or service must give direct notice to parents of their practices “with regard to the collection, use, or disclosure of personal information from children.”

Third, prior to collecting personal information from children under 13, COPPA-covered companies must get verifiable parental consent.

COPPA’s definition of “personal information” specifically includes persistent identifiers used for behavioral advertising.  It is critical to note that third-party platforms are subject to COPPA when they have actual knowledge they are collecting personal information from users of a child-directed website.

In March 2019, the FTC handed down what was then the largest civil penalty ever for violations of COPPA, following allegations that Musical.ly knew many of its users were children and still failed to seek parental consent.  There, the FTC charged that Musical.ly failed to provide notice on its website of the information it collects online from children, how it uses that information, and its disclosure practices; failed to provide direct notice to parents; failed to obtain consent from parents before collecting personal information from children; failed to honor parents’ requests to delete personal information collected from children; and retained personal information for longer than reasonably necessary.

Content creators must know COPPA’s requirements.

If a platform hosting third-party content knows that content is directed to children, it is unlawful to collect personal information from viewers without getting verifiable parental consent.

While it may be fine for most commercial websites geared to a general audience to include a corner for children, if that portion of the website collects information from users, COPPA obligations are triggered.

Comprehensive COPPA policies and procedures to protect children’s privacy are a good idea, as are competent oversight, COPPA training for relevant personnel, the identification of risks that could result in violations of COPPA, the design and implementation of reasonable controls to address the identified risks, the regular monitoring of the effectiveness of those controls, and the development and use of reasonable steps to select and retain service providers that can comply with COPPA.

The FTC and the New York Attorney General are serious about COPPA enforcement.  Companies should exercise caution with respect to such data collection practices.



© 2019 Hinch Newman LLP

FTC Settlement with Video Social Networking App Largest Civil Penalty in a Children’s Privacy Case

The Federal Trade Commission (FTC) announced a settlement with Musical.ly, a Cayman Islands corporation with its principal place of business in Shanghai, China, resolving allegations that the defendants violated the Children’s Online Privacy Protection Act (COPPA) Rule.

Musical.ly operates a video social networking app with 200 million users worldwide and 65 million in the United States. The app provides a platform for users to create short videos of themselves or others lip-syncing to music and share those videos with other users. The app also provides a platform for users to connect and interact with other users, and until October 2016 had a feature that allowed a user to tap on the “my city” tab and receive a list of other users within a 50-mile radius.

According to the complaint, the defendants (1) were aware that a significant percentage of users were younger than 13 years of age and (2) had received thousands of complaints from parents that their children under 13 had created Musical.ly accounts.

The FTC’s COPPA Rule prohibits the unauthorized or unnecessary collection of children’s personal information online by internet website operators and online services, and requires that verifiable parental consent be obtained prior to collecting, using, and/or disclosing the personal information of children under the age of 13.

In addition to requiring the payment of the largest civil penalty ever imposed for a COPPA case ($5.7 million), the consent decree prohibits the defendants from violating the COPPA Rule and requires that they delete and destroy all of the personal information of children in their possession, custody, or control unless verifiable parental consent has been obtained.

FTC Commissioners Chopra and Slaughter issued a joint statement noting their belief that the FTC should prioritize uncovering the role of corporate officers and directors and hold accountable everyone who broke the law.

 

©2019 Drinker Biddle & Reath LLP. All Rights Reserved

Happy Holidays: VTech Data Breach Affects Over 11 million Parents and Children Worldwide

The recent data breach of Hong Kong-based electronic toy manufacturer VTech Holdings Limited (“VTech” or the “Company”) is making headlines around the world for good reason: it exposed sensitive personal information of over 11 million parent and child users of VTech’s Learning Lodge app store, Kid Connect network, and PlanetVTech in 16 countries! VTech’s Learning Lodge website allows customers to download apps, games, e-books and other educational content to their VTech products; the Kid Connect network allows parents using a smartphone app to chat with their children using a VTech tablet; and PlanetVTech is an online gaming site. As of December 3rd, VTech has suspended all of its Learning Lodge sites, the Kid Connect network and thirteen other websites pending investigation.

VTech announced the cyberattack on November 27th by press release and has since issued follow-on press releases on November 30th and December 3rd, noting that “the Learning Lodge, Kid Connect and PlanetVTech databases have been attacked by a skilled hacker” and that the Company is “deeply shocked by this orchestrated and sophisticated attack.” According to the various press releases, upon learning of the cyber attack, VTech “conducted a comprehensive check of the affected site” and has “taken thorough actions against future attacks.” The Company has reported that it is currently working with FireEye’s Mandiant Incident Response services and with law enforcement worldwide to investigate the attack. According to VTech’s latest update on the incident:

  • 4,854,209 parent Learning Lodge accounts containing the following information were affected: name, email address, secret question and answer for password retrieval, IP address, mailing address, download history and encrypted passwords;

  • 6,368,509 children’s profiles containing the following information were affected: name, gender, and birthdate. 1.2 million of the affected profiles have the Kid Connect app enabled, meaning that the hackers could also have access to profile photos and undelivered Kid Connect chat messages;

  • The compromised databases also included encrypted Learning Lodge content (bulletin board postings, ebooks, apps, games, etc.), sales report logs and progress logs to track games, but did not include credit card, debit card or other financial account information or Social Security numbers, driver’s license numbers, or ID card numbers; and

  • The affected individuals are located in the following countries: USA, Canada, United Kingdom, Republic of Ireland, France, Germany, Spain, Belgium, the Netherlands, Denmark, Luxembourg, Latin America, Hong Kong, China, Australia and New Zealand. The largest numbers of affected individuals are reported in the U.S. (2,212,863 parent accounts and 2,894,091 children’s profiles), France (868,650 parent accounts and 1,173,497 children’s profiles), the UK (560,487 parent accounts and 727,155 children’s profiles), and Germany (390,985 parent accounts and 508,806 children’s profiles).

Given the magnitude and wide territorial reach of the VTech cyber attack, the incident is already on the radar of regulators in Hong Kong and at least two attorneys general in the United States. On December 1, the Hong Kong Office of the Privacy Commissioner for Personal Data announced that it has initiated “a compliance check on the data leakage incident” of VTech Learning Lodge.  In addition, on December 3rd, two separate class actions were filed against VTech Electronics North America, L.L.C. and VTech Holdings Limited in the Northern District of Illinois.  Since the data breach compromised personal information of children located in the United States (first and last name, photographs, online contact information, etc.), it is likely that the Federal Trade Commission (FTC) will investigate VTech’s compliance with the Children’s Online Privacy Protection Act (“COPPA”) and its implementing rule (as amended, the “COPPA Rule”). If a COPPA violation is found, the civil penalties can be steep, reaching up to $16,000 per violation. In addition to civil penalties imposed by a court, the FTC can require an entity to implement a comprehensive privacy program and to obtain regular, independent privacy assessments for a period of time.

©1994-2015 Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C. All Rights Reserved.

FTC Releases Extensive Report on the “Internet of Things”

McDermott Will & Emery Law Firm

On January 27, 2015, U.S. Federal Trade Commission (FTC) staff released an extensive report on the “Internet of Things” (IoT). The report, based in part on input the FTC received at its November 2013 workshop on the subject, discusses the benefits and risks of IoT products to consumers and offers best practices for IoT manufacturers to integrate the principles of security, data minimization, notice and choice into the development of IoT devices. While the FTC staff’s report does not call for IoT specific legislation at this time, given the rapidly evolving nature of the technology, it reiterates the FTC’s earlier recommendation to Congress to enact strong federal data security and breach notification legislation.

The report also describes the tools the FTC will use to ensure that IoT manufacturers consider privacy and security issues as they develop new devices. These tools include:

  • Enforcement actions under such laws as the FTC Act, the Fair Credit Reporting Act (FCRA) and the Children’s Online Privacy Protection Act (COPPA), as applicable;

  • Developing consumer and business education materials in the IoT area;

  • Participation in multi-stakeholder groups considering guidelines related to IoT; and

  • Advocacy to other agencies, state legislatures and courts to promote protections in this area.

In furtherance of its initiative to provide educational materials on IoT for businesses, the FTC also announced the publication of “Careful Connections: Building Security in the Internet of Things”.  This site provides a wealth of advice and resources for businesses on how they can go about meeting the concept of “security by design” and consider issues of security at every stage of the product development lifecycle for internet-connected devices and things.

This week’s report is one more sign supporting our prediction of increased FTC activity in the IoT space in 2015.

FTC Denies AgeCheq Parental Consent Application But Trumpets General Support for COPPA Common Consent Mechanisms

Covington & Burling Law Firm

The Federal Trade Commission (“FTC”) recently reiterated its support for the use of “common consent” mechanisms that permit multiple operators to use a single system for providing notices and obtaining verifiable consent under the Children’s Online Privacy Protection Act (“COPPA”). COPPA generally requires operators of websites or online services that are directed to children under 13 or that have actual knowledge that they are collecting personal information from children under 13 to provide notice and obtain verifiable parental consent before collecting, using, or disclosing personal information from children under 13.   The FTC’s regulations implementing COPPA (the “COPPA Rule”) do not explicitly address common consent mechanisms, but in the Statement of Basis and Purpose accompanying 2013 revisions to the COPPA Rule, the FTC stated that “nothing forecloses operators from using a common consent mechanism as long as it meets the Rule’s basic notice and consent requirements.”

The FTC’s latest endorsement of common consent mechanisms appeared in a letter explaining why the FTC was denying AgeCheq, Inc.’s application for approval of a common consent method.  The COPPA Rule establishes a voluntary process whereby companies may submit a formal application to have new methods of parental consent considered by the FTC.  The FTC denied AgeCheq’s application because it “incorporates methods already enumerated” in the COPPA Rule: (1) a financial transaction, and (2) a print-and-send form.  The implementation of these approved methods of consent in a common consent mechanism was not enough to merit a separate approval from the FTC.  According to the FTC, the COPPA Rule’s new consent approval process was intended to vet new methods of obtaining verifiable parental consent rather than specific implementations of approved methods.  While AgeCheq’s application was technically “denied,” the FTC emphasized that AgeCheq and other “[c]ompanies are free to develop common consent mechanisms without applying to the Commission for approval.”  In support of common consent mechanisms, the FTC quoted language from the 2013 Statement of Basis and Purpose and pointed out that at least one COPPA Safe Harbor program already relies on a common consent mechanism.


Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps

The National Law Review recently published an article, Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps, written by Daniel F. Gottlieb, Randall J. Ortman, and Heather Egan Sussman with McDermott Will & Emery:


On February 1, 2013, the Federal Trade Commission (FTC) released a report entitled “Mobile Privacy Disclosures: Building Trust Through Transparency” (Report), which urges mobile device application (app) platforms and developers to improve the privacy policies for their apps to better inform consumers about their privacy practices.  This report follows other recent publications from the FTC concerning mobile apps—including “Mobile Apps for Kids: Disclosures Still Not Making the Grade,” released December 2012 (December 2012 Report), and “Mobile Apps for Kids: Current Privacy Disclosures are Disappointing,” released February 2012 (February 2012 Report)—and the adoption of the amended Children’s Online Privacy Protection Act (COPPA) Rule on December 19, 2012.  (See “FTC Updates Rule for Children’s Online Privacy Protection” for more information regarding the recent COPPA amendments.)

Among other things, the Report offers recommendations to key stakeholders in the mobile device application marketplace, particularly operating system providers (e.g., Apple and Microsoft), application developers, advertising networks and related trade associations.  Such recommendations reflect the FTC’s enforcement and policy experience with mobile applications and public comment on the matter; however, where the Report goes beyond existing legal requirements, “it is not intended to serve as a template for law enforcement actions or regulations under laws currently enforced by the FTC.”  Nevertheless, key stakeholders should take the FTC’s recommendations into account when determining how they will collect, use, and transfer personal information about consumers and when preparing privacy policies to describe their information practices, because the recommendations reflect the FTC’s expectations under its consumer protection authorities.

At a minimum, operating system providers and application developers should review their existing privacy policies and make revisions, as necessary, to comply with the recommendations included within the Report.  However, all key stakeholders should consider the implications of recommendations specific to their industry segment, as summarized below.

Operating System Providers

Characterized within the Report as “gatekeepers to the app marketplace,” the FTC states that operating system providers have the “greatest ability to effectuate change with respect to improving mobile privacy disclosures.”  Operating system providers, which create and maintain the platform upon which mobile apps run, promulgate rules that app developers must follow in order to access the platform and facilitate interactions between developers and consumers.  Given their prominent role within the app marketplace, it is not surprising that the FTC directs numerous recommendations toward operating system providers, including:

  • Just-In-Time Disclosures.  The Report urges operating system providers to display just-in-time disclosures to consumers and obtain express, opt-in (rather than implied) consent before allowing apps to access sensitive information like geolocation (i.e., the real-world physical location of a mobile device) and other information that consumers may find sensitive, such as contacts, photos, calendar entries, or recorded audio or video.  Operating system providers and mobile app developers should therefore carefully consider which types of personal information practices require an opt-in rather than mere use of the app to evidence consent; a minimal sketch of this opt-in pattern appears after this list.
  • Privacy Dashboard.  The Report suggests that operating system providers should consider developing a privacy “dashboard” that would centralize privacy settings for various apps to allow consumers to easily review the types of information accessed by the apps they have downloaded.  The “dashboard” model would enable consumers to determine which apps have access to different types of information about the consumer or the consumer’s device and to revisit the choices they initially made about the apps.
  • Icons.  The Report notes that operating system providers currently use status icons for a variety of purposes, such as indicating when an app is accessing geolocation information.  The FTC suggests expansion of this practice to provide an icon that would indicate the transmission of personal information or other information more broadly.
  • Best Practices.  The Report recommends that operating system providers establish best practices for app developers.  For example, operating system providers can compel app developers to make privacy disclosures to consumers by restricting access to their platforms.
  • Review of Apps.  The Report suggests that operating system providers should also make clear disclosures to consumers about the extent to which they review apps developed for their platforms.  Such disclosures may include conditions for making apps available within the platform’s app marketplace and efforts to ensure continued compliance.
  • Do Not Track Mechanism.  The Report directs operating system providers to consider offering a “Do Not Track” (DNT) mechanism, which would provide consumers with the option to prevent tracking by advertising networks or other third parties as they use apps on their mobile devices.  This approach allows consumers to make a single election, rather than case-by-case decisions for each app.
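
To illustrate the just-in-time disclosure recommendation above, the following is a minimal sketch of gating access to sensitive data categories behind an affirmative, opt-in choice. The consent store and category names are hypothetical; the sketch simply shows the “express consent before access” pattern the Report describes, not any particular platform’s API.

```python
SENSITIVE_CATEGORIES = {"geolocation", "contacts", "photos", "calendar", "audio"}

class ConsentStore:
    """Hypothetical record of a user's affirmative, opt-in choices."""

    def __init__(self) -> None:
        self._granted: set = set()

    def grant(self, category: str) -> None:
        self._granted.add(category)

    def has_consent(self, category: str) -> bool:
        return category in self._granted

def access_sensitive_data(category: str, consent: ConsentStore) -> str:
    """Just-in-time pattern: block access to a sensitive category until the
    user has affirmatively opted in; silence or mere use is not consent."""
    if category in SENSITIVE_CATEGORIES and not consent.has_consent(category):
        raise PermissionError(
            f"Show a just-in-time disclosure and obtain opt-in consent "
            f"before accessing {category}."
        )
    return f"accessing {category}"

consent = ConsentStore()
try:
    access_sensitive_data("geolocation", consent)  # no opt-in yet, so blocked
except PermissionError as err:
    print(err)
consent.grant("geolocation")  # the user taps "Allow" after seeing the disclosure
print(access_sensitive_data("geolocation", consent))
```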

App Developers

Although some practices may be imposed upon app developers by operating system providers, as discussed above, app developers can take several steps to adopt the FTC’s recommendations, including:

  • Privacy Policies.  The FTC encourages all app developers to have a privacy policy, and to include reference to such policy when submitting apps to an operating system provider.
  • Just-In-Time Disclosures.  As with the recommendations for operating system providers, the Report suggests that app developers provide just-in-time disclosures and obtain affirmative express consent before collecting and sharing sensitive information.
  • Coordination with Advertising Networks.  The FTC argues for improved coordination and communication between app developers and advertising networks and other third parties that provide certain functions, such as data analytics, to ensure app developers have an adequate understanding of the software they are incorporating into their apps and can accurately describe such software to consumers.
  • Participation in Trade Associations.  The Report urges app developers to participate in trade associations and other industry organizations, particularly in the development of self-regulatory programs addressing privacy in mobile apps.

Advertising Networks and Other Third Parties

By specifically including advertising networks and other third parties in the Report, the FTC recognizes that cooperation with such networks and parties is necessary to achieve the recommendations outlined for operating system providers and app developers.  The recommendations for advertising networks and other third parties include:

  • Coordination with App Developers.  The Report calls upon advertising networks and other third parties to communicate with app developers to enable such developers to provide accurate disclosures to consumers.
  • DNT Mechanism.  Consistent with its recommendations for operating system providers, the FTC suggests that advertising networks and other third parties work with operating system providers to implement a DNT mechanism.

Trade Associations

The FTC states that trade associations can facilitate standardized privacy disclosures.  The Report makes the following recommendations for trade associations:

  • Icons.  Trade associations can work with operating system providers to develop standardized icons to indicate the transmission of personal information and other data.
  • Badges.  Similar to icons, the Report suggests that trade associations consider developing “badges” or other visual cues used to convey information about a particular app’s data practices.
  • Privacy Policies.  Finally, the FTC suggests that trade associations are uniquely positioned to explore other opportunities to standardize privacy policies across the mobile app industry.

Children and Mobile Apps

Commenting on progress between the February 2012 Report and December 2012 Report, both of which relied on a survey of 400 mobile apps targeted at children, the FTC stated that “little or no progress has been made” in increasing transparency in the mobile app industry with regard to privacy practices specific to children.  The December 2012 Report suggests that very few mobile apps targeted to children include basic information about the app’s privacy practices and interactive features, including the type of data collected, the purpose of the collection and whether third parties have access to such data:

  • Privacy Disclosures.  According to the December 2012 Report, approximately 20 percent of the mobile apps reviewed disclosed any privacy-related information prior to the download process and the same proportion provided access to a privacy disclosure after downloading the app.  Among those mobile apps, the December 2012 Report characterizes their disclosures as lengthy, difficult to read or lacking basic detail, such as the specific types of information collected.
  • Information Collection and Sharing Practices.  The December 2012 Report notes that 59 percent of the mobile apps transmitted some information to the app developer or to a third party.  Unique device identifiers were the most frequently transmitted data point, which the December 2012 Report cites as problematic, suggesting that such identifiers are routinely used to create user “profiles,” which may track consumers across multiple mobile apps.
  • Disclosure Practices Regarding Interactive App Features.  The FTC reports that nearly half of the apps that stated they did not include advertising actually contained advertising, including ads targeted to a mature audience.  Similarly, the December 2012 Report notes that approximately 9 percent of the mobile apps reviewed disclosed that they linked with social media applications; however, this number represented only half of the mobile apps that actually linked to social media applications.  Mobile app developers using a template privacy policy as a starting point for an app’s privacy policy should carefully tailor the template to reflect the developer’s actual privacy practices for the app.

Increased Enforcement

In addition to the reports discussed above and the revisions to the COPPA Rule, effective July 1, 2013, the FTC has also increased enforcement efforts relating to mobile app privacy.  On February 1, 2013, the FTC announced an agreement with Path Inc., operator of the Path social networking mobile app, to settle allegations that it deceived consumers by collecting personal information from their mobile device address books without their knowledge or consent.  Under the terms of the agreement, Path Inc. must establish a comprehensive privacy program, obtain independent privacy assessments every other year for the next 20 years and pay $800,000 in civil penalties specifically relating to alleged violations of the COPPA Rule.  In announcing the agreement, the FTC commented on its commitment to continued scrutiny of privacy practices within the mobile app industry, adding that “no matter what new technologies emerge, the [FTC] will continue to safeguard the privacy of Americans.”

Key Takeaways

App developers and other key stakeholders should consider the following next steps:

  • Review existing privacy policies to confirm they accurately describe current privacy practices for the particular app rather than merely following the developer’s preferred template privacy policy
  • Where practical, update actual privacy practices and privacy policies to be more in line with the FTC’s expectations for transparency and consumer choice, including use of opt-in rather than opt-out consent models
  • Revisit privacy practices in light of heightened FTC enforcement under COPPA and its other consumer protection authorities

© 2013 McDermott Will & Emery

FTC Proposes New Rules on Children’s Online Privacy Issues

Michelle Cohen of Ifrah Law recently had an article regarding Children’s Online Privacy published in The National Law Review:

On August 1, 2012, the Federal Trade Commission announced that it is issuing a Supplemental Notice of Proposed Rulemaking to modify certain of its rules under the Children’s Online Privacy Protection Act (COPPA). Industry has been waiting for FTC action regarding COPPA, as the agency previously undertook a COPPA rulemaking in September 2011 and proposed modifying certain COPPA rules to account for changes in technology, particularly mobile technology.

The FTC received over 350 comments in response to that proposal. After reviewing those comments, the FTC has decided to propose certain additional changes to its COPPA rule definitions.

In summary, COPPA gives parents control over the information websites can collect from their kids. It applies to websites directed to children under 13 – or to operators with actual knowledge that they are collecting personal information from a child under 13. It requires a specific privacy notice and that consent be obtained from parents in many circumstances before children’s information may be collected and/or used.

The FTC has proposed several changes that are of interest. Some are meant to “tighten” the COPPA rule, others are meant to provide some additional flexibility to operators.

  • The proposed change would make clear that an operator that chooses to integrate the services of third parties that collect personal information from visitors (like ad networks or plug-ins) would itself be considered a covered “operator” under the Rule.
  • The FTC is also proposing to allow websites with mixed audiences (e.g., sites that appeal both to children under 13 and to parents or older users) to age-screen visitors so that COPPA’s protections are applied only to those under 13. However, kid-directed sites or services that knowingly target under-13s as their primary audience, or whose overall content is likely to attract kids under that age, could not use that method.
  • Also, the FTC has proposed modifying the definition of what constitutes “personal information” relating to children to make it clear that a persistent identifier falls within that definition if it can be used to recognize a user over time or across different sites or services. The FTC is considering whether activities like site maintenance and analysis, use of persistent identifiers for authenticating users, maintaining user preferences, serving contextual ads, and protecting against fraud and theft should not be considered the collection of “personal information” as long as what is collected is not used or disclosed to contact a specific individual, including through the use of behaviorally targeted advertising.
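
To make the persistent-identifier proposal concrete, the following is a minimal sketch of how an operator might classify an identifier based on how it is used. The category names are hypothetical, and the logic is only an illustration of the carve-out the FTC is considering, not a statement of the final rule.

```python
# Uses the FTC is considering carving out, per the proposal described above.
INTERNAL_OPERATIONS = {
    "site_maintenance", "analysis", "authentication",
    "user_preferences", "contextual_ads", "fraud_prevention",
}

def identifier_is_personal_information(uses: set) -> bool:
    """Rough reading of the proposal: a persistent identifier is treated as
    'personal information' unless every use of it falls within the
    internal-operations carve-out and it is not used to contact a specific
    individual (e.g., through behaviorally targeted advertising)."""
    return not uses.issubset(INTERNAL_OPERATIONS)

print(identifier_is_personal_information({"authentication", "contextual_ads"}))  # False
print(identifier_is_personal_information({"behavioral_advertising"}))            # True
```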

Comments on the FTC’s proposed rule changes are due by September 10, 2012.

© 2012 Ifrah PLLC