CFPB Takes Aim at Data Brokers in Proposed Rule Amending FCRA

On December 3, the CFPB announced a proposed rule to enhance oversight of data brokers that handle consumers’ sensitive personal and financial information. The proposed rule would amend Regulation V, which implements the Fair Credit Reporting Act (FCRA), to require data brokers to comply with credit bureau-style regulations under FCRA if they sell income data or certain other financial information on consumers, regardless of its end use.

Should this rule be finalized, the CFPB would be empowered to enforce the FCRA’s privacy protections and consumer safeguards in connection with data brokers who leverage emerging technologies that became prevalent after FCRA’s enactment.

What are some of the implications of the new rule?

  • Data Brokers are Now Considered CRAs. The proposed rule defines the circumstances under which companies handling consumer data would be considered CRAs by clarifying the definition of “consumer reports.” The rule specifies that data brokers selling any of four types of consumer information—credit history, credit score, debt payments, or income/financial tier data—would generally be considered to be selling a consumer report.
  • Assembling Information About Consumers Means You are a CRA. Under the rule, an entity is a CRA if it assembles or evaluates information about consumers, including by collecting, gathering, or retaining; assessing, verifying, or validating; or contributing to or altering the content of such information. This view is in step with the Bureau’s recent Circular on AI-based background dossiers of employees. (See our prior discussion here.)
  • Header Information is Now a Consumer Report. Under the proposed rule, communications from consumer reporting agencies of certain personal identifiers that they collect—such as name, addresses, date of birth, Social Security numbers, and phone numbers—would be consumer reports. This would mean that consumer reporting agencies could only sell such information (typically referred to as “credit header” data) if the user had a permissible purpose under the FCRA.
  • Marketing is Not a Legitimate Business Need. The proposed rule emphasizes that marketing is not a “legitimate business need” under the FCRA. Accordingly, CRAs could not use consumer reports to decide for an advertiser which consumers should receive ads and would not be able to send ads to consumers on an advertiser’s behalf.
  • Enhanced Disclosure and Consent Requirements. Under the FCRA, consumers can give their consent to share data. Under the proposed rule, the Bureau clarified that consumers must be provided a clear and conspicuous disclosure stating how their consumer report will be used. It would also require data brokers to acknowledge a consumer’s right to revoke their consent. Finally, the proposed rule requires a new and separate consumer authorization for each product or service authorized by the consumer. The Bureau is focused on instances where a customer signs up for a specific product or service, such as credit monitoring, but then receives targeted marketing for a completely different product.

Comments on the rule must be received on or before March 3, 2025.

Putting It Into Practice: With the release of the rule so close to the end of Director Chopra’s term, it will be interesting to see what a new administration does with it. We expect a new CFPB director to scale back and rescind much of the informal regulatory guidance that was issued by the Biden administration. However, some aspects of the data broker rule have bipartisan support so we may see parts of it finalized in 2025.

FTC Social Media Staff Report Suggests Enforcement Direction and Expectations

The FTC’s staff report summarizes how it views the operations of social media and video streaming companies. Of particular interest is the insight it gives into potential enforcement focus in the coming months and into 2025. Chief among the FTC’s concerns in the report, issued last month, were the following:

  1. The high volume of information collected from users, including in ways they may not expect;
  2. Companies’ reliance on advertising revenue based on use of that information;
  3. Use of AI over which the FTC felt users did not have control; and
  4. A gap in protection of teens (who are not subject to COPPA).

As part of its report, the FTC recommended changes in how social media companies collect and use personal information. Those recommendations stretched over five pages of the report and fell into four categories. Namely:

  1. Minimizing what information is collected to that which is needed to provide the company’s services. This recommendation also folded in concepts of data deletion and limits on information sharing.
  2. Putting guardrails around targeted digital advertising, especially, the FTC indicated, where the targeting is based on use of sensitive personal information.
  3. Providing users with information about how automated decisions are being made. This would include not just transparency, the FTC indicated, but also having “more stringent testing and monitoring standards.”
  4. Using COPPA as a baseline in interactions with not only children under 13, but also as a model for interacting with teens.

The FTC also signaled in the report its support of federal privacy legislation that would (a) limit “surveillance” of users and (b) give consumers the type of rights that we are seeing passed at a state level.

Putting it into Practice: While this report was directed at social media companies, the FTC recommendations can be helpful for all entities. They signal the types of safeguards and restrictions that the agency is beginning to expect when companies are using large amounts of personal data, especially that of children and/or within automated decision-making tools like AI.


FTC: Three Enforcement Actions and a Ruling

In today’s digital landscape, the exchange of personal information has become ubiquitous, often without consumers fully comprehending the extent of its implications.

The recent actions undertaken by the Federal Trade Commission (FTC) shine a light on the intricate web of data extraction and mishandling that pervades our online interactions. From the seemingly innocuous permission requests of game apps to the purported protection promises of security software, consumers find themselves at the mercy of data practices that blur the lines between consent and exploitation.

The FTC’s proposed settlements with companies like X-Mode Social (“X-Mode”) and InMarket, two data aggregators, and Avast, a security software company, underscore the need for businesses to appropriately secure and limit the use of consumer data, including information previously considered innocuous, such as browsing and location data. In a world where personal information serves as currency, ensuring consumer privacy compliance has never been more critical, and the commercial risk of failing to get it right has never been greater.

X-Mode and InMarket Settlements: The proposed settlements with X-Mode and InMarket concern numerous allegations based on the mishandling of consumers’ location data. Both companies supposedly collected precise location data through their own mobile apps and those of third parties (through software development kits).  X-Mode is alleged to have sold precise location data (advertised as being 70% accurate within 20 meters or less) linked to timestamps and unique persistent identifiers (i.e., names, email addresses, etc.) of its consumers to private government contractors without obtaining proper consent. Plotting this data on a map makes it easy to reveal each person’s movements over time.

InMarket purportedly utilized location data to cross-reference such data with points of interest to sort consumers into particularized audience segments for targeted advertising purposes without adequately informing consumers – examples of audience segments include parents of preschoolers, Christian church attendees, and “wealthy and not healthy,” among other groupings.

Avast Settlement: Avast, a security software company, allegedly sold granular and re-identifiable browsing information of its consumers despite assuring consumers it would protect their privacy. Avast allegedly collected extensive browsing data of its consumers through its antivirus software and browser extensions while assuring its consumers that their browsing data would only be used in aggregated and anonymous form. The data collected by Avast revealed visits to various websites that could be attributed to particular people and allowed for inferences to be drawn about such individuals – examples include academic papers on symptoms of breast cancer, education courses on tax exemptions, government jobs in Fort Meade, Maryland with a salary over $100,000, links to FAFSA applications, and directions from one location to another, among others.

Sensitivity of Browsing and Location Data

It is important to note that none of the underlying datasets in question contained traditional types of personally identifiable information (e.g., name, identification numbers, physical descriptions, etc.) (“PII”). Even still, the three proposed settlements by the FTC underscore the sensitive nature of browsing and location data due to the insights such data reveals, such as religious beliefs, health conditions, and financial status, and the ease with which the insights can be linked to certain individuals.

In the digital age, the amount of data available about individuals online and collected by various companies makes the re-identification of individuals easier every day. Even when traditional PII is not included in a data set, by linking sufficient data points, a profile or understanding of an individual can be created. When such a profile is then linked to an identifier (such as a username, phone number, or email address provided when downloading an app or setting up an account) and cross-referenced with publicly available data, such as a name, email, phone number, or content on social media sites, it can yield deep insights into an individual. Despite the absence of traditional types of PII, such data poses significant privacy risks due to the potential for re-identification and the intimate details about individuals’ lives that it can divulge.

The FTC emphasizes the imperative for companies to recognize and treat browsing and location data as sensitive information and implement appropriately robust safeguards to protect consumer privacy. This is especially true when the data set includes information with the precision of those cited by the FTC in its proposed settlements.

Accountability and Consent

With browsing and location data, there is also a concern that the consumer may not be fully aware of how their data is used. For instance, Avast claimed to protect consumers’ browsing data and then sold that very same browsing information, often without notice to consumers. When Avast did inform customers of its practices, the FTC claims it deceptively stated any sharing would be “anonymous and aggregated.” Similarly, X-Mode claimed it would use location data for ad-personalization and location-based analytics. Consumers were unaware such location data was also sold to government contractors.

The FTC has recognized that a company may need to process an individual’s information to provide them with services or products requested by the individual. The FTC also holds that such processing does not mean the company is then free to collect, access, use, or transfer that information for other purposes (e.g., marketing, profiling, background screening, etc.). Essentially, purpose matters. As the FTC explains, a flashlight app provider cannot collect, use, store, or share a user’s precise geolocation data, nor can a tax preparation service use a customer’s information to market other products or services.

If companies want to use consumer personal information for purposes other than providing the requested product or services, the FTC states that companies should inform consumers of such uses and obtain consent to do so.

The FTC aims to hold companies accountable for their data-handling practices and ensure that consumers are provided with meaningful consent mechanisms. Companies should handle consumer data only for the purposes for which data was collected and honor their privacy promises to consumers. The proposed settlements emphasize the importance of transparency, accountability, meaningful consent, and the prioritization of consumer privacy in companies’ data handling practices.

Implementing and Maintaining Safeguards

Data, especially data that provides insights and inferences about individuals, is extremely valuable to companies, but that same data puts those individuals’ privacy at risk. Companies that sell or share information sometimes include contractual limitations on the use of the data, but not all contracts contain such restrictions, or restrictions sufficient to safeguard individuals’ privacy.

For instance, the FTC alleges that some of Avast’s underlying contracts did not prohibit the re-identification of Avast’s users. Where Avast’s underlying contracts prohibited re-identification, the FTC alleges that purchasers of the data were still able to match Avast users’ browsing data with information from other sources if the information was not “personally identifiable.” Avast also failed to audit or confirm that purchasers of data complied with its prohibitions.

The proposed complaint against X-Mode recognized that at least twice, X-Mode sold location data to purchasers who violated restrictions in X-Mode’s contracts by reselling the data they bought from X-Mode to companies further downstream. The X-Mode example shows that even when restrictions are included in contracts, they may not prevent misuse by subsequent downstream parties.

Ongoing Commitment to Privacy Protection:

The FTC stresses the importance of obtaining informed consent before collecting or disclosing consumers’ sensitive data, as misuse of such data can violate consumer privacy and expose consumers to various harms, including stigma and discrimination. While privacy notices, consent, and contractual restrictions are important, the FTC emphasizes they need to be backed up by action. Accordingly, the FTC’s proposed orders require companies to design, implement, maintain, and document safeguards to protect the personal information they handle, especially when it is sensitive in nature.

What Does a Company Need To Do?

Given the recent enforcement actions by the FTC, companies should:

  1. Consider the data they collect and whether such data is needed to provide the services and products requested by the consumer and/or serves a legitimate business need in support of providing such services and products (e.g., billing, ongoing technical support, shipping);
  2. Consider browsing and location data as sensitive personal information;
  3. Accurately inform consumers of the types of personal information collected by the company, its uses, and parties to whom it discloses the personal information;
  4. Collect, store, use, or share consumers’ sensitive personal information (including browser and location data) only with such consumers’ informed consent;
  5. Limit the use of consumers’ personal information solely to the purposes for which it was collected and not market, sell, or monetize consumers’ personal information beyond such purpose;
  6. Design, implement, maintain, document, and adhere to safeguards that actually maintain consumers’ privacy; and
  7. Audit and inspect service providers and third-party companies downstream with whom consumers’ data is shared to confirm they are (a) adhering to and complying with contractual restrictions and (b) implementing appropriate safeguards to protect such consumer data.

Another Lesson for Higher Education Institutions about the Importance of Cybersecurity Investment

Key Takeaway

A Massachusetts class action claim underscores that institutions of higher education will continue to be targets for cybercriminals – and class action plaintiffs know it.

Background

On January 4, 2023, in Jackson v. Suffolk University, No. 23-cv-10019, Jackson (Plaintiff) filed a proposed class action lawsuit in the U.S. District Court for the District of Massachusetts against her alma mater, Suffolk University (Suffolk), arising from a data breach affecting thousands of current and former Suffolk students.

The complaint alleges that an unauthorized party gained access to Suffolk’s computer network on or about July 9, 2022.  After learning of the unauthorized access, Suffolk engaged cybersecurity experts to assist in an investigation. Suffolk completed the investigation on November 14, 2022.  The investigation concluded that an unauthorized third party gained access to and/or exfiltrated files containing personally identifiable information (PII) for students who enrolled after 2002.

The complaint further alleges that the PII exposed in the data breach included students’ full names, Social Security Numbers, Driver License numbers, state identification numbers, financial account information, and Protected Health Information.  While Suffolk did not release the total number of students affected by the data breach, the complaint alleges that approximately 36,000 Massachusetts residents were affected.  No information was provided about affected out-of-state residents.

Colleges and Universities are Prime Targets for Cybercriminals

Unfortunately, Suffolk’s data breach is not an outlier.  Colleges and universities present a wealth of opportunities for cyber criminals because they house massive amounts of sensitive data, including employee and student personal and financial information, medical records, and confidential and proprietary data.  Given how stolen data can be sold through open and anonymous forums on the Dark Web, colleges and universities will continue to remain prime targets for cybercriminals.

Recognizing this, the FBI issued a warning for higher education institutions in March 2021, informing them that cybercriminals have been targeting institutions of higher education with ransomware attacks.  In May 2022, the FBI issued a second alert, warning that cyber bad actors continue to conduct attacks against colleges and universities.

Suffolk Allegedly Breached Data Protection Duty

In the complaint, Plaintiff alleges that Suffolk did not follow industry and government guidelines to protect student PII.  In particular, Plaintiff alleges that Suffolk’s failure to protect student PII violated the Federal Trade Commission Act, 15 U.S.C.A. § 45, and that Suffolk failed to comply with the Financial Privacy Rule of the Gramm-Leach-Bliley Act (GLBA), 15 U.S.C.A. § 6801.  Further, the suit alleges that Suffolk violated the Massachusetts Right to Privacy Law, Mass. Gen. Laws Ann. ch. 214, § 1B, as well as its common law duties.

How Much Cybersecurity is Enough?

To mitigate cyber risk, colleges and universities must not only follow applicable government guidelines but also consider following industry best practices to protect student PII.

In particular, GLBA requires a covered organization to designate a qualified individual to oversee its information security program and conduct risk assessments that continually assess internal and external risks to the security, confidentiality and integrity of personal information.  After the risk assessment, the organization must address the identified risks and document the specific safeguards intended to address those risks.  See 16 CFR § 314.4.  

Suffolk, as well as other colleges and universities, may also want to look to Massachusetts law for guidance about how to further invest in its cybersecurity program.  Massachusetts was an early leader among U.S. states when, in 2007, it enacted the “Regulations to safeguard personal information of commonwealth residents” (Mass. Gen. Laws ch. 93H § 2) (Data Security Law).  The Data Security Law – still among the most prescriptive general state data security laws – sets forth a list of minimum requirements that, while not specific to colleges and universities, serves as a good cybersecurity checklist for all organizations:

  1. Designation of one or more employees responsible for the WISP.
  2. Assessments of risks to the security, confidentiality and/or integrity of organizational Information and the effectiveness of the current safeguards for limiting those risks, including ongoing employee and independent contractor training, compliance with the WISP and tools for detecting and preventing security system failures.
  3. Employee security policies relating to protection of organizational Information outside of business premises.
  4. Disciplinary measures for violations of the WISP and related policies.
  5. Access control measures that prevent terminated employees from accessing organizational Information.
  6. Management of service providers that access organizational Information as part of providing services directly to the organization, including retaining service providers capable of protecting organizational Information consistent with the Data Security Regulations and other applicable laws and requiring service providers by contract to implement and maintain appropriate measures to protect organizational Information.
  7. Physical access restrictions for records containing organizational Information and storage of those records in locked facilities, storage areas or containers.
  8. Regular monitoring of the WISP to ensure that it is preventing unauthorized access to or use of organizational Information and upgrading the WISP as necessary to limit risks.
  9. Review of the WISP at least annually, or more often if business practices that relate to the protection of organizational Information materially change.
  10. Documentation of responsive actions taken in connection with any “breach of security” and mandatory post-incident review of those actions to evaluate the need for changes to business practices relating to protection of organizational Information.

An organization not implementing any of these controls should consider documenting the decision-making process as a defensive measure.  In implementing these requirements and recommendations, colleges and universities can best position themselves to thwart cybercriminals and plaintiffs alike.

© Copyright 2023 Squire Patton Boggs (US) LLP

D.C. District Court Limits the HIPAA Privacy Rule Requirement for Covered Entities to Provide Access to Records

On January 23, 2020, the D.C. District Court narrowed an individual’s right to request that HIPAA covered entities furnish the individual’s own protected health information (“PHI”) to a third party at the individual’s request, and removed the cap on the fee covered entities may charge to transmit that PHI to a third party.

Specifically, the Court stated that individuals may direct PHI to such third parties only in an electronic format, and that HIPAA covered entities and their business associates are not subject to the reasonable, cost-based fee cap for PHI directed to third parties.

The HIPAA Privacy Rule grants individuals rights to access their PHI in a designated record set, and it specifies the data formats and permissible fees that HIPAA covered entities (and their business associates) may charge for such production. See 45 C.F.R. § 164.524. When individuals request copies of their own PHI, the Privacy Rule permits a HIPAA covered entity (or its business associate) to charge a reasonable, cost-based fee that excludes, for example, search and retrieval costs. See 45 C.F.R. § 164.524(c)(4). But, when an individual requests his or her own PHI to be sent to a third party, both the required format of that data (electronic or otherwise) and the fees that a covered entity may charge for that service have been the subject of additional OCR guidance over the years—guidance that the D.C. District Court has now, in part, vacated.

The Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act) set a statutory cap on the fee that a covered entity may charge an individual for delivering records in an electronic form. 42 U.S.C. § 17935(e)(3). Then, in the 2013 Omnibus Rule, developed pursuant to Administrative Procedure Act rulemaking, the Department of Health and Human Services, Office for Civil Rights (“HHS OCR”) implemented the HITECH Act statutory fee cap in two ways. First, OCR determined that the fee cap applied regardless of the format of the PHI—electronic or otherwise. Second, OCR stated the fee cap also applied if the individual requested that a third party receive the PHI. 78 Fed. Reg. 5566, 5631 (Jan. 25, 2013). Finally, in its 2016 Guidance document on individual access rights, OCR provided additional information regarding these provisions of the HIPAA Privacy Rule. OCR’s FAQ on this topic is available here.

The D.C. District Court struck down OCR’s 2013 and 2016 implementation of the HITECH Act, in part. Specifically, the Court held that OCR’s 2013 HIPAA Omnibus Final Rule compelling delivery of PHI to third parties regardless of the records’ format is arbitrary and capricious insofar as it goes beyond the statutory requirements set by Congress; the statute requires only that covered entities, upon an individual’s request, transmit PHI to a third party in electronic form. Additionally, the Court held that OCR’s broadening of the fee limitation under 45 C.F.R. § 164.524(c)(4) in the 2016 Guidance document titled “Individuals’ Right under HIPAA to Access their Health Information 45 C.F.R. Sec. 164.524” violates the APA because HHS did not follow the requisite notice and comment procedure. Ciox Health, LLC v. Azar, et al., No. 18-cv-0040 (D.D.C. January 23, 2020).

All other requirements for patient access remain the same, including required time frames for the provision of access to individuals, and to third parties designated by such individuals. It remains to be seen, however, how HHS will move forward after these developments from a litigation perspective and how this decision will affect other HHS priorities, such as interoperability and information blocking.


© Polsinelli PC, Polsinelli LLP in California

For more on HIPAA Regulation, see the National Law Review Health Law & Managed Care section.

My Business Is In Arizona, Why Do I Care About California Privacy Laws? How the CCPA Impacts Arizona Businesses

Arizona businesses are not typically concerned about complying with the newest California laws going into effect. However, one California law in particular—the CCPA or California Consumer Privacy Act—has a scope that extends far beyond California’s border with Arizona. Indeed, businesses all over the world that have customers or operations in California must now be mindful of whether the CCPA applies to them and, if so, whether they are in compliance.

What is the CCPA?

The CCPA is a comprehensive data privacy regulation enacted by the California Legislature that became effective on January 1, 2020. It was passed on September 13, 2018 and has undergone a series of substantive amendments over the past year and a few months.

Generally, the CCPA gives California consumers a series of rights with respect to how companies acquire, store, use, and sell their personal data. The CCPA’s combination of mandatory disclosures and notices, rights of access, rights of deletion, statutory fines, and threat of civil lawsuits is a significant move towards empowering consumers to control their personal data.

Many California businesses are scrambling to implement the necessary policies and procedures to comply with the CCPA in 2020. In fact, you may have begun to notice privacy notices on the primary landing page for national businesses. However, Arizona businesses cannot assume that the CCPA stops at the Arizona border.

Does the CCPA apply to my business in Arizona?

The CCPA has specific criteria for whether a company is considered a California business. The CCPA applies to for-profit businesses “doing business in the State of California” that also:

  • Have annual gross revenues in excess of twenty-five million dollars; or
  • Buy, receive, sell, or share the personal information of 50,000 or more California consumers, households, or devices per year; or
  • Have 50% or more of revenue generated by selling California consumers’ personal information

The CCPA does not include an express definition of what it means to be “doing business” in California. While it will take courts some time to interpret the scope of the CCPA, any business with significant sales, employees, property, or operations in California should consider whether the CCPA might apply to them.

How do I know if I am collecting a consumer’s personal information?

“Personal information” under the CCPA generally includes any information that “identifies, relates to, describes, is capable of being associated with, or could reasonably be linked” with a specific consumer. As the legalese of this definition implies, “personal information” includes a wide swath of data that your company may already be collecting about consumers.

There is no doubt that personal identifiers like name, address, email addresses, social security numbers, etc. are personal information. But information like biometric data, search and browsing activity, IP addresses, purchase history, and professional or employment-related information are all expressly included under the CCPA’s definition. Moreover, the broad nature of the CCPA means that other categories of data collected—although not expressly identified by the CCPA—may be deemed to be “personal information” in an enforcement action.

What can I do to comply with the CCPA?

If the CCPA might apply to your company, now is the time to take action. Compliance will necessarily be different for each business depending on the nature of its operation and the use(s) of personal information. However, there are some common steps that each company can take.

The first step towards compliance with the CCPA is understanding what data your company collects, how it is stored, whether it is transferred or sold, and whether any vendors or subsidiaries also have access to the data. Next, an organization should prepare a privacy notice that complies with the CCPA to post on its website and include in its app interface.

The most substantial step in complying with the CCPA is to develop and implement policies and procedures that help the company conform to the various provisions of the CCPA. The policies will need to provide up-front disclosures to consumers, allow consumers to opt-out, handle consumer requests to produce or delete personal information, and guard against any perceived discrimination against consumers that exercise rights under the CCPA.

The company will also need to review contracts with third-party service providers and vendors to ensure it can comply with the CCPA. For example, if a third-party cloud service will be storing personal information, the company will want to verify that its contract allows it to assemble and produce that information within statutory deadlines if requested by a consumer.

At least you have some time!

The good news is that the CCPA includes a grace period until July 1, 2020 before the California Attorney General can bring enforcement actions. Thus, Arizona businesses that may have ignored the quirky California privacy law to this point have a window to bring their operations into compliance. However, Arizona companies that may need to comply with the CCPA should consult with counsel as soon as possible to begin the process. The attorneys at Ryley Carlock & Applewhite are ready to help you analyze your risk and comply with the CCPA.


Copyright © 2020 Ryley Carlock & Applewhite. A Professional Association. All Rights Reserved.

Learn more about the California Consumer Privacy Act (CCPA) on the National Law Review Communications, Media & Internet Law page.

EU-US Privacy Shield to Launch August 1, Replacing Safe Harbor

I. Introduction: Privacy Shield to Go Live August 1 (at Last)

The replacement for Safe Harbor is finally in effect, over nine months after Safe Harbor was struck down by the Court of Justice of the EU in the Schrems case. As most readers will be aware, Privacy Shield provides an important legal mechanism for transferring personal information from the EU to the US. The Department of Commerce (Commerce) has promised to launch a Privacy Shield website on August 1, 2016 that will allow companies to certify compliance with Privacy Shield.

The Privacy Shield documents consist of a 44-page “Adequacy Decision” and 104 pages of “Annexes” that contain key information concerning Privacy Shield’s standards and enforcement mechanisms. Companies that are considering certifying under Privacy Shield should review the entire Adequacy Decision and its Annexes, as well as the promised FAQs and other documents that the Department of Commerce will provide on the new Privacy Shield website. A good starting point for companies is Annex II, which contains the essential Privacy Shield “Principles” and a set of “Supplemental Principles” that clarify certain points and provide useful examples for putting Privacy Shield into practice.

Our summary aims to highlight key points and provide a basic roadmap as companies start to get to grips with the new Privacy Shield requirements.

II. Privacy Shield Principles

The Principles set out in Privacy Shield will be largely familiar to companies that had certified under Safe Harbor, but Privacy Shield contains a lot more detail and occasionally demands more stringent standards and actions than Safe Harbor.

1. Notice. Notice must be provided as soon as possible to the individual – preferably at the time the individual is asked to provide personal information. Notice must be given in “clear and conspicuous language.” The company must tell the individual that it participates in Privacy Shield, and must link to the Privacy Shield list that will be published on the Web by Commerce. The company must tell individuals what types of personal information are being collected, for what purposes, and with whom it may be shared. Individuals must be told how to make complaints to the company and their options for resolving disputes (which the company must select from a menu of limited alternatives, as discussed further below). The company must inform the individual of the company’s obligation to disclose personal information in response to lawful requests by public authorities, including for national security or law enforcement. A new requirement calls for the company to describe its liability with regard to transfers of the personal information to third parties (also discussed further below).

2. Choice. Choice comes into play primarily when the data controller wants to disclose personal information to a third party (other than agents under a contract) or use it for a purpose that is materially different than the purpose for which it was collected (which would have been communicated to the individual under the Notice principle). In many instances, consent can be obtained on an opt-out basis, provided that the new use or transfer has been disclosed clearly and conspicuously, and the individual is given a “readily available” means to exercise her choice. Critically, however, the transfer and processing of “sensitive” information requires the affirmative express consent of the individual, subject to a short list of exceptions described in the Supplemental Principles. An opt-out is not sufficient for sensitive information, which includes medical/health, race/ethnicity, political opinions, religious or philosophical beliefs, trade union membership, and information about sexuality. (As before, financial information is not considered sensitive, but companies should recall that risk-based security measures still need to be taken even if opt-out consent is used.)

3. Accountability for Onward Transfer. This Principle contains some key differences from Safe Harbor and should be carefully reviewed by companies looking at Privacy Shield. Privacy Shield has tightened up the requirements for transferring personal information to a third party who acts as a data controller. It is not possible simply to rely on the transferee being Privacy Shield-certified. The transferor company must enter into a contract with the transferee company that specifies that the information will only be processed for “limited and specified purposes consistent with the consent provided by the individual” and that the transferee will comply with the Principles across the board. If the transferee is acting as the transferor’s agent (i.e., as a “data processor” in EU terminology) then the transferor must also take “reasonable and appropriate steps” to ensure that the transferee is processing the personal information consistently with the Principles. In all cases, the transferee must agree to notify the transferor if the transferee can no longer meet its privacy obligations. Commerce can request a summary or copy of the privacy provisions of a company’s contracts with its agents.

4. Security. The standard for data security is “reasonable and appropriate measures” to protect personal data from being compromised, taking into account the nature of the personal information that is being stored. It’s strongly implied that companies need to perform a risk assessment in order to determine precisely what measures would be reasonable and appropriate. The risk assessment and security measures should be documented in the event of an investigation or audit, and for purposes of the required annual internal review.

5. Data Integrity and Purpose Limitation. Indiscriminate collection of personal information is not permitted under Privacy Shield. Instead, personal information should be gathered for particular purposes, and only information that is relevant to those purposes can be collected. It’s not always possible to anticipate every purpose for which certain personal information might be used, so Privacy Shield allows use for additional purposes that are “not incompatible with the purpose for which it has been collected or subsequently authorized by the individual.” The benchmark for compatible processing is “the expectations of a reasonable person given the context of the collection.” Generally speaking, processing personal information for common business risk-mitigation reasons, such as anti-fraud and security purposes, will be compatible with the original purpose. Personal information cannot be retained for longer than it is needed to perform the processing that is permitted under this Principle. Additionally, companies have an affirmative obligation to take “reasonable steps” to ensure that the personal information they collect and store is “reliable for its intended use, accurate, complete, and current.” These requirements imply that periodic data cleaning may be necessary for uses that extend over a significant period of time.

6. Access. Individuals have the right to know what personal information a company holds concerning them, and to have the information corrected if it is inaccurate, or deleted if it has been processed in violation of the Privacy Shield Principles. There are a couple of exceptions: if the expense of providing access is disproportionate to the risks to the individual’s privacy, or if another person’s rights would be violated by giving access, then a company can decline. Companies should use this option sparingly and document their reasons for refusing any access requests.

7. Recourse, Enforcement & Liability. One of the EU Commission’s main objectives in negotiating Privacy Shield was to ensure that the program had sharper teeth than Safe Harbor. Privacy Shield features more proactive enforcement by Commerce and the FTC, and aggrieved individuals who feel their complaints haven’t been satisfactorily resolved can bring the weight of their local DPA and Commerce to bear on the offending company. We describe the recourse, enforcement and liability requirements below in a separate section.

III. Privacy Shield Supplemental Principles

The Supplemental Principles in Annex II elaborate on some of the basic Principles (summarized above) and, in some cases, qualify companies’ obligations. The summary below highlights some significant points – but again, companies should read the Supplemental Principles in full to appreciate some of the nuances of the Privacy Shield requirements.

1. Sensitive Personal Data. This section sets out some exceptions to the affirmative opt-in consent requirement that mirror the exceptions in the EU Data Protection Directive.

2. Journalistic Exceptions. Privacy Shield acknowledges the significance of the First Amendment in US law. Personal information that is gathered for journalistic purposes, including from published media sources, is not subject to Privacy Shield’s requirements.

3. Secondary Liability (of ISPs, etc.) Companies acting as mere conduits of personal information, such as ISPs and telecoms providers, are not required to comply with Privacy Shield with regard to the data that travels over their networks.

4. Due Diligence and Audits. Companies performing due diligence and audits are not required to notify individuals whose personal information is processed incidental to the diligence exercise or audit. Security requirements and purpose limitations would still apply.

5. Role of the Data Protection Authorities. The Supplemental Principles describe the role of the DPA panels and the DPAs generally in greater detail. As discussed above, companies processing their own human resources information will be required to cooperate directly with the DPAs, and the Supplemental Principles seem to imply that cooperation includes designating the DPA Panels as those companies’ independent recourse mechanism. In addition to the fees attendant on this choice (capped at $500/year), companies will have to pay translation costs relating to any complaints against them.

6. Self-certification. This section outlines what the self-certification process should look like when the Privacy Shield enrollment website launches. It also contains information about what will happen when a Privacy Shield participant decides to leave the program.

7. Verification. Privacy Shield-certified companies must back up their claims with documentation. We discuss this further in the section below on enforcement.

8. Access. This section describes access requirements in more detail and also gives some guidance as to when access requests can be refused.

9. Human Resources Data. Companies planning to use Privacy Shield for the transfer of EU human resources data will want to review this section carefully. Privacy Shield does not replace or relieve companies from EU employment law obligations. Looking beyond the overseas transfer element, it’s critical to ensure that employee personal information has been collected and is processed in full compliance with applicable EU laws concerning employees.

10. Contracts for Onward Transfers.  US companies are sometimes unaware that all EU data controllers are required to have data processing contracts in place with any data processor, regardless of the processor’s location. Participation in Privacy Shield, by itself, is not enough. If a Privacy Shield-certified data controller wants to transfer the EU-origin personal information to another data controller, it can do so under a contract that requires the transferee to provide the same level of protection as Privacy Shield, except that the transferee can designate an independent recourse mechanism that is not one of the Privacy Shield-specific mechanisms. Companies will need to review their existing and new contracts carefully.

11. Dispute Resolution and Enforcement. We discuss this separately below.

12. Choice – Timing of Opt Out (Direct Marketing). This section focuses on opt-out consent for direct marketing. Companies should provide opt-out choices on all direct marketing communications. The guidance states that “an organization may use information for certain direct marketing purposes when it is impracticable to provide the individual with an opportunity to opt out before using the information, if the organization promptly gives the individual such opportunity at the same time (and upon request at any time) to decline (at no cost to the individual) to receive any further direct marketing communications and the organization complies with the individual’s wishes.” However, companies should keep in mind that the European standard for impracticability here may be tougher than we would expect in the US. In particular, US companies should consider EU requirements for direct marketing via e-mail or text, which typically requires advance consent unless the marketing is to an existing customer and is for goods or services that are similar to the ones previously purchased by the customer.

13. Travel Information. Common sense prevails with regard to travel data – when travel arrangements are being made for an EU employee or customer, the data transfer can take place outside of the Privacy Shield requirements if the customer has given “unambiguous consent” or if the transfer is necessary to fulfill contractual obligations to the customer (including the terms of frequent flyer programs).

14. Pharmaceutical and Medical Products. Pharma companies will want to review the fairly lengthy discussion of how Privacy Shield applies to clinical studies, regulatory compliance, adverse event monitoring and reporting, and other issues specific to the pharma industry. Privacy Shield is broadly helpful – and in some respects clearer than the pending GDPR.

15. Public Record and Publicly Available Information. Some, but not all, of the Principles apply to information obtained from public records or other public sources, subject to various caveats that make this section important to read in full.

16. Access Requests by Public Authorities. Privacy Shield companies have the option of publishing statistics concerning requests by US public authorities for access to EU personal information. However, publishing such statistics is not mandatory.

IV. Recourse, Enforcement and Liability

A significant change in Privacy Shield from Safe Harbor is the addition of specific mechanisms for recourse and dispute resolution. One of the major perceived failings of Safe Harbor was that EEA citizens had no reasonable means to obtain relief or even to lodge a complaint. In order to satisfactorily self-certify, US companies will need to put processes in place to handle complaints.

Under Privacy Shield, at a minimum, such recourse mechanisms must include:

1. Independent Investigation and Resolution of Complaints: Readily available independent recourse mechanisms by which each individual’s complaints and disputes are investigated and expeditiously resolved at no cost to the individual … and damages awarded where the applicable law or private-sector initiatives provide;

2. Verification that You Do What You Say: Follow-up procedures for verifying that the attestations and assertions organizations make about their privacy practices are true and that privacy practices have been implemented as presented, and in particular, with regard to cases of non-compliance; and

3. You Must Fix the Problems: Obligations to remedy problems arising out of failure to comply with the Principles by organizations announcing their adherence to them and consequences for such organizations. Sanctions must be sufficiently rigorous to ensure compliance by organizations.

Prompt responses to complaints are required. If a company uses an EU Data Protection Authority as its third-party recourse mechanism and fails to comply with the DPA’s advice within 25 days, the DPA may refer the matter to the FTC, which has agreed to give priority consideration to all referrals of non-compliance from EU DPAs.

The verification requirement is more robust than under Safe Harbor. Companies may choose either to self-assess their compliance or to engage outside compliance reviews. Self-assessment includes certifying that the company’s policies comply with the Principles and that it has procedures in place for training, disciplining misconduct, and responding to complaints. Whether conducted internally or by an outside reviewer, verification must take place once a year.

A Privacy Shield-certified organization is responsible for onward transfers and retains liability under the Principles if its third-party processor violates them, with some exceptions. Third-party vendor management and contractual requirements for compliance with the Principles will be important components of managing this risk.

Dispute Resolution

There is ample room for operational confusion under Privacy Shield, nowhere more so than with respect to dispute resolution. There are multiple methods available to data subjects (individuals) to lodge complaints, and companies subscribing to Privacy Shield must be prepared to respond through any of them. When companies certify under Privacy Shield, they need to choose an independent enforcement and dispute resolution mechanism. The choices are either:

  • Data Protection Authority Panels
  • Independent Recourse Mechanism

a. Individuals. Individual data subjects may raise any concerns or complaints to the company itself, which is obligated to respond within 45 days. Individuals also have the option of working through their local DPA, which may in turn contact the company and/or the Department of Commerce to resolve the dispute.

b. Independent Recourse. As discussed above, the Privacy Shield requires that entities provide an independent recourse mechanism, either a private sector alternative dispute resolution provider (such as the American Arbitration Association, BBB, or TRUSTe) or a panel of European DPAs. NOTE THAT THE DPA PANEL IS MANDATORY IF YOU ARE APPLYING TO PRIVACY SHIELD TO PROCESS/TRANSFER HR DATA. For disputes involving HR data that are not resolved internally by the company (or any applicable trade union grievance procedures) to the satisfaction of the employee, the company must direct the employee to the DPA in the jurisdiction where the employee works.

c. Binding Arbitration. A Privacy Shield Panel will be composed of one or three independent arbitrators admitted to practice law in the US, with expertise in US and EU privacy law. Appeal to the Panel is open to individuals who have raised complaints with the organization, used the independent recourse mechanism, and/or sought relief through their DPA, but whose complaint is still fully or partially unresolved. The Panel can only impose equitable relief, such as access or correction. Arbitrations should be concluded within 90 days. Further, both parties may seek judicial review of the arbitral decision under the US Federal Arbitration Act.

Enforcement

In addition to the above discussion on the multiple avenues available to data subjects for complaints, there are other expanded types of enforcement under Privacy Shield. A certifying organization’s compliance may be directly or indirectly monitored by the US Department of Commerce, the FTC (or Department of Transportation), EU DPAs, and private sector independent recourse mechanisms or other privacy self-regulatory bodies.

Privacy Shield brings an expanded role to the Department of Commerce for monitoring and supervising compliance. If you have been following Safe Harbor, you will recall that one of the EU’s grounds for disapproval was the apparent lack of actual enforcement by US regulatory authorities against self-certifying organizations. The Department of Commerce has committed to a larger role and has greatly increased the size of the program staff.

Some of the new responsibilities of the Department of Commerce under Privacy Shield include:

  • Serving as a liaison between organizations and DPAs for Privacy Shield compliance issues;
  • Conducting searches for false claims of participation by organizations that have never participated in the program, and taking corrective action when such false claims are found;
  • Conducting ex officio investigations of those who withdraw from the program or fail to recertify, to verify that such organizations are not making any false claims regarding their participation. If it finds any false claims, it will first issue a warning and then, if the matter is not resolved, refer the matter to the appropriate regulator for enforcement action;
  • Conducting periodic ex officio compliance reviews, which will include sending questionnaires to participating organizations to identify issues that may warrant further follow-up action. In particular, such reviews will take place when the Department has received complaints about the organization’s compliance, the organization does not respond satisfactorily to its inquiries and information requests, or there is “credible” evidence that the organization does not comply with its commitments. Organizations will be required to provide a copy of the privacy provisions in their service provider contracts upon request. The Department of Commerce will consult with the appropriate DPAs when necessary; and
  • Verifying self-certification requirements by evaluating, among other things, the organization’s privacy policy for the required elements and verifying the organization’s registration with a dispute resolution provider.

Private sector independent recourse mechanisms will have a duty to actively report organizations’ failures to comply with their rulings to the Department of Commerce. Upon receipt of such notification, the Department will remove the organization from the Privacy Shield List.

The above overview illustrates the complexity of Privacy Shield vs. Safe Harbor and the multiplication of authorities in charge of oversight, all of which is likely to result in greater regulatory scrutiny of and compliance costs for participating organizations. By way of contrast, when an organization relies on alternative transfer mechanisms such as the Standard Clauses, the regulatory oversight is performed by EU regulators against the EU company (as data exporter). Therefore, before settling on a transfer mechanism, organizations will want to consider the regulatory involvement and compliance costs associated with each option.

V. Choosing Your Next Steps

Privacy Shield may not appeal to all US companies. Privacy Shield allows for a degree of flexibility in handling new data flows. However, that flexibility comes at the cost of fees, rigorous internal reviews, and arguably much more onerous audits and enforcement than the two main alternatives: Binding Corporate Rules for intra-group transfers, and the Standard Clauses for controller-to-controller or controller-to-processor transfers (regardless of corporate affiliation). Data transfers within corporate groups may be better addressed by Binding Corporate Rules that speak specifically to the group’s global privacy practices – or even by the Standard Clauses, particularly for smaller corporations with only a few affiliates. Even outside corporate groups, the Standard Clauses may be adequate if the data flows are straightforward and unlikely to change much over time. An important point to note is that, in comparison to Safe Harbor, Privacy Shield requires more detailed company-to-company contracts when personal information is to be transferred – it’s no longer enough that both companies participate in the program. US companies should weigh the potential operational benefits of Privacy Shield against its increased burdens.

It is important to consider timing. The Commerce Department Privacy Shield website will be “open for business” as of August 1. Lest you despair about the possibility of analyzing and updating those contracts that implicate the Accountability for Onward Transfer Principle in order to certify to Privacy Shield, Annex II has provided a bit of a “grace period” for what have been called early joiners.

The Privacy Principles apply immediately upon certification. Recognizing that the Principles will impact commercial relationships with third parties, organizations that certify to the Privacy Shield Framework in the first two months following the Framework’s effective date shall bring existing commercial relationships with third parties into conformity with the Accountability for Onward Transfer Principle as soon as possible, and in any event no later than nine months from the date upon which they certify to the Privacy Shield. During that interim period, where organizations transfer data to a third party, they shall (i) apply the Notice and Choice Principles, and (ii) where personal data is transferred to a third party acting as an agent, ascertain that the agent is obligated to provide at least the same level of protection as is required by the Principles.

If your company determines that Privacy Shield is the right choice, and you are diligent about the ground work required to accurately certify before that two-month window closes, you will be able to take advantage of the nine-month grace period to get those third party relationships into line.

Finally, US companies should stay alert to the legal challenges that the Standard Clauses are currently facing (again driven by concerns about mass surveillance), the possibility that EU regulators may start exacting further commitments when approving BCRs, and the very high likelihood that new legal challenges will be mounted against Privacy Shield shortly after it is implemented. Even if a company adopts Privacy Shield, or instead elects to stick with the Standard Clauses, it may want to get ready to switch if one or the other is struck down by the Court of Justice of the EU. Of course, if the Court of Justice strikes down both Privacy Shield and the Standard Clauses, it will be back to the drawing board for EU and US government negotiators.