FTC Social Media Staff Report Suggests Enforcement Direction and Expectations

The FTC’s staff report summarizes how it views the operations of social media and video streaming companies. Of particular interest is the insight it gives into potential enforcement focus in the coming months and into 2025. Chief among the FTC’s concerns in the report, issued last month, were the following:

  1. The high volume of information collected from users, including in ways they may not expect;
  2. Companies’ reliance on advertising revenue based on the use of that information;
  3. Use of AI over which, in the FTC’s view, users lack control; and
  4. A gap in protection of teens (who are not subject to COPPA).

As part of its report, the FTC recommended changes in how social media companies collect and use personal information. Those recommendations stretched over five pages of the report and fell into four categories:

  1. Minimizing what information is collected to that which is needed to provide the company’s services. This recommendation also folded in concepts of data deletion and limits on information sharing.
  2. Putting guardrails around targeted digital advertising, especially, the FTC indicated, where the targeting is based on sensitive personal information.
  3. Providing users with information about how automated decisions are being made. This would include not just transparency, the FTC indicated, but also having “more stringent testing and monitoring standards.”
  4. Using COPPA as a baseline for interactions with children under 13 and as a model for interactions with teens.

The FTC also signaled in the report its support of federal privacy legislation that would (a) limit “surveillance” of users and (b) give consumers the type of rights that we are seeing passed at a state level.

Putting it into Practice: While this report was directed at social media companies, the FTC recommendations can be helpful for all entities. They signal the types of safeguards and restrictions that the agency is beginning to expect when companies are using large amounts of personal data, especially that of children and/or within automated decision-making tools like AI.


U.S. Sues TikTok for Children’s Online Privacy Protection Act (COPPA) Violations

On Friday, August 2, 2024, the United States sued ByteDance, TikTok, and its affiliates for violating the Children’s Online Privacy Protection Act of 1998 (“COPPA”) and the Children’s Online Privacy Protection Rule (“COPPA Rule”). In its complaint, the Department of Justice alleges TikTok collected, stored, and processed vast amounts of data from millions of child users of its popular social media app.

In June, the FTC voted to refer the matter to the DOJ, stating that it had determined there was reason to believe TikTok (f.k.a. Musical.ly, Inc.) had violated a 2019 FTC consent order and that the agency had also uncovered additional potential COPPA and FTC Act violations. The lawsuit, filed in the Central District of California, alleges that TikTok is directed to children under age 13, that TikTok has permitted children to evade its age gate, that TikTok has collected data from children without first notifying their parents and obtaining verifiable parental consent, that TikTok has failed to honor parents’ requests to delete their children’s accounts and information, and that TikTok has failed to delete the accounts and information of users the company knows are children. The complaint also alleges that TikTok failed to comply with COPPA even for accounts in the platform’s “Kids Mode” and that TikTok improperly amassed profiles on Kids Mode users. The complaint seeks civil penalties of up to $51,744 per violation per day from January 10, 2024, to the present for the improper collection of children’s data, as well as permanent injunctive relief to prevent future violations of the COPPA Rule.

The lawsuit comes on the heels of the U.S. Senate’s passage this week of the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA) by a 91-3 bipartisan vote. It remains to be seen whether the House will take up the bills when it returns from recess in September.

American Privacy Rights Act Advances with Significant Revisions

On May 23, 2024, the U.S. House Committee on Energy and Commerce Subcommittee on Data, Innovation, and Commerce approved a revised draft of the American Privacy Rights Act (“APRA”), which was released just 36 hours before the markup session. With the subcommittee’s approval, the APRA will now advance to full committee consideration. The revised draft includes several notable changes from the initial discussion draft, including:

  • New Section on COPPA 2.0 – the revised APRA draft incorporates the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) under Title II, which differs to a certain degree from the COPPA 2.0 proposal currently before the Senate (e.g., removal of the revised “actual knowledge” standard; removal of applicability to teens over age 12 and under age 17).
  • New Section on Privacy By Design – the revised APRA draft includes a new dedicated section on privacy by design. This section requires covered entities, service providers and third parties to establish, implement, and maintain reasonable policies, practices and procedures that identify, assess and mitigate privacy risks related to their products and services during the design, development and implementation stages, including risks to covered minors.
  • Expansion of Public Research Permitted Purpose – as an exception to the general data minimization obligation, the revised APRA draft adds another permissible purpose for processing data for public or peer-reviewed scientific, historical, or statistical research projects. These research projects must be in the public interest and comply with all relevant laws and regulations. If the research involves transferring sensitive covered data, the revised APRA draft requires the affirmative express consent of the affected individuals.
  • Expanded Obligations for Data Brokers – the revised APRA draft expands obligations for data brokers by requiring them to include a mechanism for individuals to submit a “Delete My Data” request. This mechanism, similar to the California Delete Act, requires data brokers to delete all covered data related to an individual that they did not collect directly from that individual, if the individual so requests.
  • Changes to Algorithmic Impact Assessments – while the initial APRA draft required large data holders to conduct and report a covered algorithmic impact assessment to the FTC if they used a covered algorithm posing a consequential risk of harm to individuals, the revised APRA requires such impact assessments for covered algorithms to make a “consequential decision.” The revised draft also allows large data holders to use certified independent auditors to conduct the impact assessments, directs the reporting mechanism to NIST instead of the FTC, and expands requirements related to algorithm design evaluations.
  • Consequential Decision Opt-Out – while the initial APRA draft allowed individuals to invoke an opt-out right against covered entities’ use of a covered algorithm making or facilitating a consequential decision, the revised draft now also allows individuals to request that consequential decisions be made by a human.
  • New and/or Revised Definitions – the revised APRA draft’s definition section includes new terms, such as “contextual advertising” and “first party advertising.” The revised APRA draft also redefines certain terms, including “covered algorithm,” “sensitive covered data,” “small business” and “targeted advertising.”

WW International to Pay $1.5 Million Civil Penalty for Alleged COPPA Violations

In 2014, with childhood obesity on the rise in the United States, tech company Kurbo, Ltd. (Kurbo) marketed a free app for kids that, according to the company, was “designed to help kids and teens ages 8-17 reach a healthier weight.” When WW International (WW) (formerly Weight Watchers) acquired Kurbo in 2018, the app was rebranded “Kurbo by WW,” and WW continued to market the app to children as young as eight. But according to the Federal Trade Commission (FTC), Kurbo’s privacy practices were not exactly child-friendly, even if its app was. The FTC’s complaint, filed by the Department of Justice (DOJ) last month, claims that WW’s notice, data collection, and data retention practices violated the Children’s Online Privacy Protection Act Rule (COPPA Rule). WW and Kurbo, under a stipulated order, agreed to pay a $1.5 million civil penalty in addition to complying with a range of injunctive provisions. These provisions include, but are not limited to, deleting all personal information of children whose parents did not provide verifiable parental consent in a specified timeframe, and deleting “Affected Work Product” (defined in the order to include any models or algorithms developed in whole or in part using children’s personal information collected through the Kurbo Program).

Complaint Background

The COPPA Rule applies to any operator of a commercial website or online service directed to children that collects, uses, and/or discloses personal information from children and to any operator of a commercial website or online service that has actual knowledge that it collects, uses, and/or discloses personal information from children. Operators must notify parents and obtain their consent before collecting, using, or disclosing personal information from children under 13.

The complaint states that children enrolled in the Kurbo app by signing up through the app or having a parent do it on their behalf. Once on Kurbo, users could enter personal information such as height, weight, and age, and the app then tracked their weight, food consumption, and exercise. However, the FTC alleges that Kurbo’s age gate was porous, requiring no verification process to establish that children who affirmed they were over 13 were the age they claimed to be or that users asserting they were parents were indeed parents. In fact, the complaint alleges that the registration area featured a “tip-off” screen that gave visitors just two choices for registration: the “I’m a parent” option or the “I’m at least 13” option. Visitors saw the legend, “Per U.S. law, a child under 13 must sign up through a parent” on the registration page featuring these choices. Thousands of users who indicated that they were at least 13 were in fact younger and were able to change their information to falsify their real age. Users who lied about their age or who falsely claimed to be parents were able to continue to use the app. In 2020, after a warning from the FTC, Kurbo implemented a registration screen that removed the legend and the “at least 13” option. However, the new process failed to provide verification measures to establish that users claiming to be parents were indeed parents.
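The allegations point to a concrete engineering lesson: an age screen should be neutral and sticky, asking for a full date of birth without tipping off the cutoff, and preventing a rejected visitor from simply re-submitting an older birthdate. Below is a minimal sketch of that pattern in Python; the function names, session handling, and routing values are illustrative assumptions, not a description of Kurbo’s actual code.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13

def age_on(today: date, birth_date: date) -> int:
    """Return a user's age in whole years as of `today`."""
    years = today.year - birth_date.year
    # Subtract a year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def screen_signup(birth_date: date, session: dict) -> str:
    """Neutral age gate: collect a full date of birth with no hint of the
    cutoff, and remember a failed attempt so the visitor cannot simply
    re-enter an older birthdate (the 'porous gate' weakness alleged here)."""
    if session.get("age_gate_failed"):
        return "blocked"  # no immediate retries with a revised birthdate
    if age_on(date.today(), birth_date) < COPPA_AGE_THRESHOLD:
        session["age_gate_failed"] = True
        return "parental_consent_required"  # route to a verifiable-consent flow
    return "allowed"
```

The two design choices doing the work are the neutral prompt (no legend announcing the under-13 rule) and the session flag, which closes the retry loophole the complaint describes.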

Kurbo’s notice of its data collection practices and its data retention practices also fell short. The COPPA Rule requires an operator to “post a prominent and clearly labeled link to an online notice of its information practices with regard to children on the home or landing page or screen of its Web site or online service, and, at each area of the Web site or online service where personal information is collected from children.” But beginning in November 2019, Kurbo’s notice at registration was buried in a list of hyperlinks that parents were not required to click through, and the notice failed to list all the categories of information the app collected from children. Further, Kurbo did not comply with the COPPA Rule’s mandate to keep children’s personal information only as long as reasonably necessary for the purpose for which it was collected and then to delete it. Instead, the company held on to personal information indefinitely unless parents specifically requested its removal.

Stipulated Order

In addition to imposing a $1.5 million civil penalty, the order, which was approved by the court on March 3, 2022, requires WW and Kurbo to:

  • Refrain from disclosing, using, or benefitting from children’s personal information collected in violation of the COPPA Rule;
  • Delete all personal information Kurbo collected in violation of the COPPA Rule within 30 days;
  • Provide a written statement to the FTC that details Kurbo’s process for providing notice and seeking verifiable parental consent;
  • Destroy all affected work product derived from improperly collecting children’s personal information and confirm to the FTC that deletion has been carried out;
  • Delete all children’s personal information collected within one year of the user’s last activity on the app; and
  • Create and follow a retention schedule that states the purpose for which children’s personal information is collected, the specific business need for retaining such information, and criteria for deletion, including a set timeframe no longer than one year (a sketch of such a schedule follows this list).
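The retention-schedule requirement lends itself to being captured as structured data that can be audited automatically. The sketch below illustrates one way to do that in Python; the data categories, purposes, and helper function are hypothetical examples, not terms drawn from the order.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RetentionRule:
    category: str       # what kind of children's personal information is collected
    purpose: str        # why it is collected
    business_need: str  # the specific need for retaining it
    max_days: int       # deletion criterion; the order caps this at one year

# Hypothetical schedule entries, for illustration only.
RETENTION_SCHEDULE = [
    RetentionRule("food_log", "track nutrition goals", "show weekly progress", 365),
    RetentionRule("weight_entries", "track weight over time", "chart trends", 365),
    RetentionRule("support_messages", "resolve user issues", "audit responses", 90),
]

def rules_exceeding_cap(schedule, cap_days=365):
    """Flag any rule whose retention window exceeds the one-year cap."""
    return [rule for rule in schedule if rule.max_days > cap_days]

assert rules_exceeding_cap(RETENTION_SCHEDULE) == []  # schedule complies with the cap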

Implications of the Order

Following the U.S. Supreme Court’s decision in AMG Capital Management, LLC v. Federal Trade Commission, which halted the FTC’s ability to use its Section 13(b) authority to seek equitable monetary relief for violations of the FTC Act, the FTC has been pushing Congress to grant it greater enforcement powers. In the meantime, the FTC has used other enforcement tools, including the recent resurrection of the agency’s long-dormant Penalty Offense Authority under Section 5(m)(1)(B) of the FTC Act and a renewed willingness to use algorithmic disgorgement (which the FTC first applied in the 2019 Cambridge Analytica case).

Algorithmic disgorgement involves “requir[ing] violators to disgorge not only the ill-gotten data, but also the benefits—here, the algorithms—generated from that data,” as then-Acting FTC Chair Rebecca Kelly Slaughter stated in a speech last year. This order appears to be the first time algorithmic disgorgement was applied by the Commission in an enforcement action under COPPA.

Children’s privacy issues continue to attract the attention of the FTC and lawmakers at both federal and state levels. Companies that collect children’s personal information should be careful to ensure that their privacy policies and practices fully conform to the COPPA Rule.

© 2022 Keller and Heckman LLP

Recent COPPA Settlements Offer Compliance Reminders

The recently announced FTC settlement with YouTube and its parent company, and the 2018 settlement between the New York Office of the Attorney General and Oath, have set a new bar when it comes to COPPA compliance.

The settlements offer numerous takeaways, including reminders to those that use persistent identifiers to track children online and deliver them targeted ads.  These takeaways include, but are not limited to, the following.

FTC Chairman Joseph Simons stated that “YouTube touted its popularity with children to prospective corporate clients … yet when it came to complying with COPPA, the company refused to acknowledge that portions of its platform were clearly directed to kids.”

First, under COPPA, a child-directed website or online service – or a site that has actual knowledge it’s collecting or maintaining personal information from a child – must give clear notice on its site of “what information it collects from children, how it uses such information and its disclosure practices for such information.”

Second, the website or service must give direct notice to parents of their practices “with regard to the collection, use, or disclosure of personal information from children.”

Third, prior to collecting personal information from children under 13, COPPA-covered companies must get verifiable parental consent.

COPPA’s definition of “personal information” specifically includes persistent identifiers used for behavioral advertising.  It is critical to note that third-party platforms are subject to COPPA when they have actual knowledge they are collecting personal information from users of a child-directed website.

In March 2019, the FTC handed down what was then the largest civil penalty ever for violations of COPPA following allegations that Musical.ly knew many of its users were children and still failed to seek parental consent.  There, the FTC charged that Musical.ly failed to provide notice on its website of the information it collects online from children, how it uses that information and its disclosure practices; failed to provide direct notice to parents; failed to obtain consent from parents before collecting personal information from children; failed to honor parents’ requests to delete personal information collected from children; and retained personal information for longer than reasonably necessary.

Content creators must know COPPA’s requirements.

If a platform hosting third-party content knows that content is directed to children, it is unlawful to collect personal information from viewers without getting verifiable parental consent.

While it may be fine for most commercial websites geared to a general audience to include a corner for children, if that portion of the website collects information from users, COPPA obligations are triggered.

Comprehensive COPPA policies and procedures to protect children’s privacy are a good idea, as are competent oversight, COPPA training for relevant personnel, the identification of risks that could result in violations of COPPA, the design and implementation of reasonable controls to address the identified risks, the regular monitoring of the effectiveness of those controls, and the development and use of reasonable steps to select and retain service providers that can comply with COPPA.

The FTC and the New York Attorney General are serious about COPPA enforcement.  Companies should exercise caution with respect to such data collection practices.



© 2019 Hinch Newman LLP

FTC Settlement with Video Social Networking App Largest Civil Penalty in a Children’s Privacy Case

The Federal Trade Commission (FTC) announced a settlement with Musical.ly, a Cayman Islands corporation with its principal place of business in Shanghai, China, resolving allegations that the defendants violated the Children’s Online Privacy Protection Act (COPPA) Rule.

Musical.ly operates a video social networking app with 200 million users worldwide and 65 million in the United States. The app provides a platform for users to create short videos of themselves or others lip-syncing to music and share those videos with other users. The app also provides a platform for users to connect and interact with other users, and until October 2016 had a feature that allowed a user to tap on the “my city” tab and receive a list of other users within a 50-mile radius.

According to the complaint, the defendants (1) were aware that a significant percentage of users were younger than 13 years of age and (2) had received thousands of complaints from parents that their children under 13 had created Musical.ly accounts.

The FTC’s COPPA Rule prohibits the unauthorized or unnecessary collection of children’s personal information online by internet website operators and online services, and requires that verifiable parental consent be obtained prior to collecting, using, and/or disclosing personal information of children under the age of 13.

In addition to requiring the payment of the largest civil penalty ever imposed for a COPPA case ($5.7 million), the consent decree prohibits the defendants from violating the COPPA Rule and requires that they delete and destroy all of the personal information of children in their possession, custody, or control unless verifiable parental consent has been obtained.

FTC Commissioners Chopra and Slaughter issued a joint statement noting their belief that the FTC should prioritize uncovering the role of corporate officers and directors and hold accountable everyone who broke the law.

 

©2019 Drinker Biddle & Reath LLP. All Rights Reserved

Happy Holidays: VTech Data Breach Affects Over 11 million Parents and Children Worldwide

The recent data breach of Hong Kong-based electronic toy manufacturer VTech Holdings Limited (“VTech” or the “Company”) is making headlines around the world for good reason: it exposed sensitive personal information of over 11 million parent and child users of VTech’s Learning Lodge app store, Kid Connect network, and PlanetVTech in 16 countries! VTech’s Learning Lodge website allows customers to download apps, games, e-books and other educational content to their VTech products, the Kid Connect network allows parents using a smartphone app to chat with their children using a VTech tablet, and PlanetVTech is an online gaming site. As of December 3rd, VTech has suspended all its Learning Lodge sites, the Kid Connect network and thirteen other websites pending investigation.

VTech announced the cyberattack on November 27th by press release and has since issued follow-on press releases on November 30th and December 3rd, noting that “the Learning Lodge, Kid Connect and PlanetVTech databases have been attacked by a skilled hacker” and that the Company is “deeply shocked by this orchestrated and sophisticated attack.” According to the various press releases, upon learning of the cyberattack, VTech “conducted a comprehensive check of the affected site” and has “taken thorough actions against future attacks.” The Company has reported that it is currently working with FireEye’s Mandiant Incident Response services and with law enforcement worldwide to investigate the attack. According to VTech’s latest update on the incident:

  • 4,854,209 parent Learning Lodge accounts containing the following information were affected: name, email address, secret question and answer for password retrieval, IP address, mailing address, download history and encrypted passwords;

  • 6,368,509 children’s profiles containing the following information were affected: name, gender, and birthdate. 1.2 million of the affected profiles had enabled the Kid Connect app, meaning that the hackers could also have had access to profile photos and undelivered Kid Connect chat messages;

  • The compromised databases also included encrypted Learning Lodge content (bulletin board postings, ebooks, apps, games, etc.), sales report logs and progress logs to track games, but did not include credit card, debit card or other financial account information or Social Security numbers, driver’s license numbers, or ID card numbers; and

  • The affected individuals are located in the following countries and regions: USA, Canada, United Kingdom, Republic of Ireland, France, Germany, Spain, Belgium, the Netherlands, Denmark, Luxembourg, Latin America, Hong Kong, China, Australia and New Zealand. The largest numbers of affected individuals were reported in the U.S. (2,212,863 parent accounts and 2,894,091 children’s profiles), France (868,650 parent accounts and 1,173,497 children’s profiles), the UK (560,487 parent accounts and 727,155 children’s profiles), and Germany (390,985 parent accounts and 508,806 children’s profiles).

Given the magnitude and wide territorial reach of the VTech cyberattack, the incident is already on the radar of regulators in Hong Kong and at least two attorneys general in the United States. On December 1, the Hong Kong Office of the Privacy Commissioner for Personal Data announced that it has initiated “a compliance check on the data leakage incident” of VTech Learning Lodge.  In addition, on December 3rd, two separate class actions were filed against VTech Electronics North America, L.L.C. and VTech Holdings Limited in the Northern District of Illinois.  Since the data breach compromised personal information of children located in the United States (first and last name, photographs, online contact information, etc.), it is likely that the Federal Trade Commission (FTC) will investigate VTech’s compliance with the Children’s Online Privacy Protection Act (“COPPA”) and its implementing rule (as amended, the “COPPA Rule”). If a COPPA violation is found, the civil penalties can be steep, up to $16,000 per violation. In addition to civil penalties imposed by a court, the FTC can require an entity to implement a comprehensive privacy program and to obtain regular, independent privacy assessments for a period of time.

©1994-2015 Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C. All Rights Reserved.

FTC Releases Extensive Report on the “Internet of Things”

McDermott Will & Emery Law Firm

On January 27, 2015, U.S. Federal Trade Commission (FTC) staff released an extensive report on the “Internet of Things” (IoT). The report, based in part on input the FTC received at its November 2013 workshop on the subject, discusses the benefits and risks of IoT products to consumers and offers best practices for IoT manufacturers to integrate the principles of security, data minimization, notice and choice into the development of IoT devices. While the FTC staff’s report does not call for IoT-specific legislation at this time, given the rapidly evolving nature of the technology, it reiterates the FTC’s earlier recommendation to Congress to enact strong federal data security and breach notification legislation.

The report also describes the tools the FTC will use to ensure that IoT manufacturers consider privacy and security issues as they develop new devices. These tools include:

  • Enforcement actions under such laws as the FTC Act, the Fair Credit Reporting Act (FCRA) and the Children’s Online Privacy Protection Act (COPPA), as applicable;

  • Developing consumer and business education materials in the IoT area;

  • Participation in multi-stakeholder groups considering guidelines related to IoT; and

  • Advocacy to other agencies, state legislatures and courts to promote protections in this area.

In furtherance of its initiative to provide educational materials on IoT for businesses, the FTC also announced the publication of “Careful Connections: Building Security in the Internet of Things.”  This publication provides a wealth of advice and resources for businesses on how they can implement “security by design” and consider issues of security at every stage of the product development lifecycle for internet-connected devices and things.

This week’s report is one more sign supporting our prediction of increased FTC activity in the IoT space in 2015.

FTC Denies AgeCheq Parental Consent Application But Trumpets General Support for COPPA Common Consent Mechanisms

Covington & Burling Law Firm

The Federal Trade Commission (“FTC”) recently reiterated its support for the use of “common consent” mechanisms that permit multiple operators to use a single system for providing notices and obtaining verifiable consent under the Children’s Online Privacy Protection Act (“COPPA”). COPPA generally requires operators of websites or online services that are directed to children under 13 or that have actual knowledge that they are collecting personal information from children under 13 to provide notice and obtain verifiable parental consent before collecting, using, or disclosing personal information from children under 13.   The FTC’s regulations implementing COPPA (the “COPPA Rule”) do not explicitly address common consent mechanisms, but in the Statement of Basis and Purpose accompanying 2013 revisions to the COPPA Rule, the FTC stated that “nothing forecloses operators from using a common consent mechanism as long as it meets the Rule’s basic notice and consent requirements.”

The FTC’s latest endorsement of common consent mechanisms appeared in a letter explaining why the FTC was denying AgeCheq, Inc.’s application for approval of a common consent method.  The COPPA Rule establishes a voluntary process whereby companies may submit a formal application to have new methods of parental consent considered by the FTC.  The FTC denied AgeCheq’s application because it “incorporates methods already enumerated” in the COPPA Rule: (1) a financial transaction, and (2) a print-and-send form.  The implementation of these approved methods of consent in a common consent mechanism was not enough to merit a separate approval from the FTC.  According to the FTC, the COPPA Rule’s new consent approval process was intended to vet new methods of obtaining verifiable parental consent rather than specific implementations of approved methods.  While AgeCheq’s application was technically “denied,” the FTC emphasized that AgeCheq and other “[c]ompanies are free to develop common consent mechanisms without applying to the Commission for approval.”  In support of common consent mechanisms, the FTC quoted language from the 2013 Statement of Basis and Purpose and pointed out that at least one COPPA Safe Harbor program already relies on a common consent mechanism.


Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps

The National Law Review recently published an article, Federal Trade Commission (FTC) Recommends Privacy Practices for Mobile Apps, written by Daniel F. Gottlieb, Randall J. Ortman, and Heather Egan Sussman with McDermott Will & Emery:


On February 1, 2013, the Federal Trade Commission (FTC) released a report entitled “Mobile Privacy Disclosures: Building Trust Through Transparency” (Report), which urges mobile device application (app) platforms and developers to improve the privacy policies for their apps to better inform consumers about their privacy practices.  This report follows other recent publications from the FTC concerning mobile apps—including “Mobile Apps for Kids: Disclosures Still Not Making the Grade,” released December 2012 (December 2012 Report), and “Mobile Apps for Kids: Current Privacy Disclosures are Disappointing,” released February 2012 (February 2012 Report)—and the adoption of the amended Children’s Online Privacy Protection Act (COPPA) Rule on December 19, 2012.  (See “FTC Updates Rule for Children’s Online Privacy Protection” for more information regarding the recent COPPA amendments.)

Among other things, the Report offers recommendations to key stakeholders in the mobile device application marketplace, particularly operating system providers (e.g., Apple and Microsoft), application developers, advertising networks and related trade associations.  Such recommendations reflect the FTC’s enforcement and policy experience with mobile applications and public comment on the matter; however, where the Report goes beyond existing legal requirements, “it is not intended to serve as a template for law enforcement actions or regulations under laws currently enforced by the FTC.”  Nevertheless, such key stakeholders should take the FTC’s recommendations into account when determining how they will collect, use and transfer personal information about consumers and when preparing privacy policies to describe their information practices, because the recommendations reflect the FTC’s expectations under its consumer protection authorities.

At a minimum, operating system providers and application developers should review their existing privacy policies and make revisions, as necessary, to comply with the recommendations included within the Report.  However, all key stakeholders should consider the implications of recommendations specific to their industry segment, as summarized below.

Operating System Providers

Characterized within the Report as “gatekeepers to the app marketplace,” the FTC states that operating system providers have the “greatest ability to effectuate change with respect to improving mobile privacy disclosures.”  Operating system providers, which create and maintain the platform upon which mobile apps run, promulgate rules that app developers must follow in order to access the platform and facilitate interactions between developers and consumers.  Given their prominent role within the app marketplace, it is not surprising that the FTC directs numerous recommendations toward operating system providers, including:

  • Just-In-Time Disclosures.  The Report urges operating system providers to display just-in-time disclosures to consumers and obtain express, opt-in (rather than implied) consent before allowing apps to access sensitive information like geolocation (i.e., the real-world physical location of a mobile device), and other information that consumers may find sensitive, such as contacts, photos, calendar entries or recorded audio or video.  Thus, operating system providers and mobile app developers should carefully consider the types of personal information practices that require an opt-in rather than mere use of the app to evidence consent (a sketch of this gating pattern follows this list).
  • Privacy Dashboard.  The Report suggests that operating system providers should consider developing a privacy “dashboard” that would centralize privacy settings for various apps to allow consumers to easily review the types of information accessed by the apps they have downloaded.  The “dashboard” model would enable consumers to determine which apps have access to different types of information about the consumer or the consumer’s device and to revisit the choices they initially made about the apps.
  • Icons.  The Report notes that operating system providers currently use status icons for a variety of purposes, such as indicating when an app is accessing geolocation information.  The FTC suggests expansion of this practice to provide an icon that would indicate the transmission of personal information or other information more broadly.
  • Best Practices.  The Report recommends that operating system providers establish best practices for app developers.  For example, operating system providers can compel app developers to make privacy disclosures to consumers by restricting access to their platforms.
  • Review of Apps.  The Report suggests that operating system providers should also make clear disclosures to consumers about the extent to which they review apps developed for their platforms.  Such disclosures may include conditions for making apps available within the platform’s app marketplace and efforts to ensure continued compliance.
  • Do Not Track Mechanism.  The Report directs operating system providers to consider offering a “Do Not Track” (DNT) mechanism, which would provide consumers with the option to prevent tracking by advertising networks or other third parties as they use apps on their mobile devices.  This approach allows consumers to make a single election, rather than case-by-case decisions for each app.
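To make the just-in-time disclosure and opt-in recommendations concrete, here is a minimal sketch of how an app might gate access to sensitive data types behind an at-the-moment disclosure; the class, the data-type names, and the callback are illustrative assumptions, not an API taken from the Report.

```python
SENSITIVE_TYPES = {"geolocation", "contacts", "photos", "calendar", "audio", "video"}

class ConsentRegistry:
    """Tracks per-user, per-data-type opt-in consent. The absence of a
    record means no consent: this is opt-in, not opt-out."""

    def __init__(self) -> None:
        self._grants: set[tuple[str, str]] = set()

    def record_opt_in(self, user_id: str, data_type: str) -> None:
        self._grants.add((user_id, data_type))

    def has_opt_in(self, user_id: str, data_type: str) -> bool:
        return (user_id, data_type) in self._grants

def access_data(user_id: str, data_type: str, registry: ConsentRegistry, show_disclosure):
    """Gate sensitive data behind a just-in-time disclosure: the disclosure
    is shown at the moment of access, and express consent is required before
    the app proceeds. `show_disclosure` is a UI callback that returns True
    only if the user affirmatively opts in; returns None if consent is refused."""
    if data_type in SENSITIVE_TYPES and not registry.has_opt_in(user_id, data_type):
        if not show_disclosure(user_id, data_type):
            return None  # user declined; do not collect
        registry.record_opt_in(user_id, data_type)
    return f"<{data_type} for {user_id}>"  # placeholder for the real data access
```

A privacy dashboard of the kind the Report describes falls out naturally from the same registry: enumerating a user’s recorded grants yields the per-app, per-data-type view the FTC recommends.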

App Developers

Although some practices may be imposed upon app developers by operating system providers, as discussed above, app developers can take several steps to adopt the FTC’s recommendations, including:

  • Privacy Policies.  The FTC encourages all app developers to have a privacy policy, and to include reference to such policy when submitting apps to an operating system provider.
  • Just-In-Time Disclosures.  As with the recommendations for operating system providers, the Report suggests that app developers provide just-in-time disclosures and obtain affirmative express consent before collecting and sharing sensitive information.
  • Coordination with Advertising Networks.  The FTC argues for improved coordination and communication between app developers and advertising networks and other third parties that provide certain functions, such as data analytics, to ensure app developers have an adequate understanding of the software they are incorporating into their apps and can accurately describe such software to consumers.
  • Participation in Trade Associations.  The Report urges app developers to participate in trade associations and other industry organizations, particularly in the development of self-regulatory programs addressing privacy in mobile apps.

Advertising Networks and Other Third Parties

By specifically including advertising networks and other third parties in the Report, the FTC recognizes that cooperation with such networks and parties is necessary to achieve the recommendations outlined for operating system providers and app developers.  The recommendations for advertising networks and other third parties include:

  • Coordination with App Developers.  The Report calls upon advertising networks and other third parties to communicate with app developers to enable such developers to provide accurate disclosures to consumers.
  • DNT Mechanism.  Consistent with its recommendations for operating system providers, the FTC suggests that advertising networks and other third parties work with operating system providers to implement a DNT mechanism.

Trade Associations

The FTC states that trade associations can facilitate standardized privacy disclosures.  The Report makes the following recommendations for trade associations:

  • Icons.  Trade associations can work with operating system providers to develop standardized icons to indicate the transmission of personal information and other data.
  • Badges.  Similar to icons, the Report suggests that trade associations consider developing “badges” or other visual cues used to convey information about a particular app’s data practices.
  • Privacy Policies.  Finally, the FTC suggests that trade associations are uniquely positioned to explore other opportunities to standardize privacy policies across the mobile app industry.

Children and Mobile Apps

Commenting on progress between the February 2012 Report and December 2012 Report, both of which relied on a survey of 400 mobile apps targeted at children, the FTC stated that “little or no progress has been made” in increasing transparency in the mobile app industry with regard to privacy practices specific to children.  The December 2012 Report suggests that very few mobile apps targeted to children include basic information about the app’s privacy practices and interactive features, including the type of data collected, the purpose of the collection and whether third parties have access to such data:

  • Privacy Disclosures.  According to the December 2012 Report, approximately 20 percent of the mobile apps reviewed disclosed any privacy-related information prior to the download process and the same proportion provided access to a privacy disclosure after downloading the app.  Among those mobile apps, the December 2012 Report characterizes their disclosures as lengthy, difficult to read or lacking basic detail, such as the specific types of information collected.
  • Information Collection and Sharing Practices.  The December 2012 Report notes that 59 percent of the mobile apps transmitted some information to the app developer or to a third party.  Unique device identifiers were the most frequently transmitted data point, which the December 2012 Report cites as problematic, suggesting that such identifiers are routinely used to create user “profiles,” which may track consumers across multiple mobile apps.
  • Disclosure Practices Regarding Interactive App Features.  The FTC reports that nearly half of the apps that stated they did not include advertising actually contained advertising, including ads targeted to a mature audience.  Similarly, the December 2012 Report notes that approximately 9 percent of the mobile apps reviewed disclosed that they linked with social media applications; however, this number represented only half of the mobile apps that actually linked to social media applications.  Mobile app developers using a template privacy policy as a starting point for an app’s privacy policy should carefully tailor the template to reflect the developer’s actual privacy practices for the app.

Increased Enforcement

In addition to the reports discussed above and the revisions to the COPPA Rule, effective July 1, 2013, the FTC has also increased enforcement efforts relating to mobile app privacy.  On February 1, 2013, the FTC announced an agreement with Path Inc., operator of the Path social networking mobile app, to settle allegations that it deceived consumers by collecting personal information from their mobile device address books without their knowledge or consent.  Under the terms of the agreement, Path Inc. must establish a comprehensive privacy program, obtain independent privacy assessments every other year for the next 20 years and pay $800,000 in civil penalties specifically relating to alleged violations of the COPPA Rule.  In announcing the agreement, the FTC commented on its commitment to continued scrutiny of privacy practices within the mobile app industry, adding that “no matter what new technologies emerge, the [FTC] will continue to safeguard the privacy of Americans.”

Key Takeaways

App developers and other key stakeholders should consider the following next steps:

  • Review existing privacy policies to confirm they accurately describe current privacy practices for the particular app rather than merely following the developer’s preferred template privacy policy
  • Where practical, update actual privacy practices and privacy policies to be more in line with the FTC’s expectations for transparency and consumer choice, including use of opt-in rather than opt-out consent models
  • Revisit privacy practices in light of heightened FTC enforcement under COPPA and its other consumer protection authorities

© 2013 McDermott Will & Emery