Consumer Privacy Update: What Organizations Need to Know About State Privacy Laws Going into Effect in 2024 and 2025

Over the past several years, the number of states with comprehensive consumer data privacy laws has grown rapidly from just a handful—California, Colorado, Virginia, Connecticut, and Utah—to roughly twenty by some counts.

Many of these state laws will go into effect from the fourth quarter of 2024 through 2025. We have previously written in more detail on New Jersey’s comprehensive data privacy law, which goes into effect January 15, 2025, and Tennessee’s comprehensive data privacy law, which goes into effect July 1, 2025. Some laws have already gone into effect, such as Texas’s Data Privacy and Security Act and Oregon’s Consumer Privacy Act, both of which became effective in July 2024. Now is a good time to take stock of the current landscape as the next batch of state privacy laws takes effect.

Over the next year, the following laws will become effective:

  1. Montana Consumer Data Privacy Act (effective Oct. 1, 2024)
  2. Delaware Personal Data Privacy Act (effective Jan. 1, 2025)
  3. Iowa Consumer Data Protection Act (effective Jan. 1, 2025)
  4. Nebraska Data Privacy Act (effective Jan. 1, 2025)
  5. New Hampshire Privacy Act (effective Jan. 1, 2025)
  6. New Jersey Data Privacy Act (effective Jan. 15, 2025)
  7. Tennessee Information Protection Act (effective July 1, 2025)
  8. Minnesota Consumer Data Privacy Act (effective July 31, 2025)
  9. Maryland Online Data Privacy Act (effective Oct. 1, 2025)

These nine state privacy laws contain many similarities, broadly conforming to the Virginia Consumer Data Protection Act we discussed here.  All nine laws listed above contain the following familiar requirements:

(1) disclosing data handling practices to consumers;

(2) including certain contractual terms in data processing agreements;

(3) performing risk assessments (with the exception of Iowa); and

(4) affording resident consumers certain rights, such as the right to access or know the personal data processed by a business, the right to correct any inaccurate personal data, the right to request deletion of personal data, the right to opt out of targeted advertising or the sale of personal data, and the right to opt out of the processing of sensitive information.

The laws contain more than a few noteworthy differences. Each law differs in the scope of its application. The applicability thresholds vary based on: (1) the number of state residents whose personal data the company (or “controller”) controls or processes, or (2) the proportion of revenue a controller derives from the sale of personal data. Maryland, Delaware, and New Hampshire each have a 35,000-consumer processing threshold. Nebraska, similar to the recently passed data privacy law in Texas, applies to controllers that do not qualify as small businesses and that process personal data or engage in personal data sales. It is also important to note that Iowa adopted a comparatively narrower definition of a sale of personal data, limiting it to transactions involving monetary consideration. All of the laws require that the company conduct business in the state.

With respect to the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”), Iowa’s, Montana’s, Nebraska’s, New Hampshire’s, and Tennessee’s laws exempt HIPAA-regulated entities altogether, while Delaware’s, Maryland’s, Minnesota’s, and New Jersey’s laws exempt only protected health information (“PHI”) regulated by HIPAA. As a result, HIPAA-regulated entities will have the added burden of assessing whether particular data is covered by HIPAA or by an applicable state privacy law.

With respect to the Gramm-Leach-Bliley Act (“GLBA”), eight of these nine comprehensive privacy laws contain an entity-level exemption for GLBA-covered financial institutions. By contrast, Minnesota’s law exempts only data regulated by the GLBA, making Minnesota, California, and Oregon the three states whose consumer privacy laws contain information-level GLBA exemptions.

Not least of all, Maryland’s law stands apart from the other data privacy laws due to a number of unique obligations, including:

  • A prohibition on the collection, processing, and sharing of a consumer’s sensitive data except when doing so is “strictly necessary to provide or maintain a specific product or service requested by the consumer.”
  • A broad prohibition on the sale of sensitive data for monetary or other valuable consideration unless such sale is necessary to provide or maintain a specific product or service requested by a consumer.
  • Special provisions applicable to “Consumer Health Data” processed by entities not regulated by HIPAA. Note that “Consumer Health Data” laws also exist in Nevada, Washington, and Connecticut as we previously discussed here.
  • A prohibition on selling or processing minors’ data for targeted advertising if the controller knows or should have known that the consumer is under 18 years of age.

While states continue to enact comprehensive data privacy laws, the possibility remains of a federal privacy law that would set a national standard. The American Privacy Rights Act (“APRA”) went through several iterations in the House Committee on Energy and Commerce this year, and it reflects many of the elements of these state laws, including transparency requirements and consumer rights. A key sticking point, however, continues to be the broad private right of action included in the proposed APRA but absent from the state privacy laws; only California’s law, which we discussed here, includes a private right of action, and it is narrowly circumscribed to data breaches. Considering the November 2024 election, it is likely that federal efforts to create a comprehensive privacy law will stall until the election cycle is over and the composition of the White House and Congress is known.

Montana Passes 9th Comprehensive Consumer Privacy Law in the U.S.

On May 19, 2023, Montana’s Governor signed Senate Bill 384, the Consumer Data Privacy Act. Montana joins California, Colorado, Connecticut, Indiana, Iowa, Tennessee, Utah, and Virginia in enacting a comprehensive consumer privacy law. The law is scheduled to take effect on October 1, 2024.

When does the law apply?

The law applies to a person who conducts business in the state of Montana and:

  • Controls or processes the personal data of not less than 50,000 consumers (defined as Montana residents), excluding data controlled or processed solely to complete a payment transaction; or
  • Controls or processes the personal data of not less than 25,000 consumers and derives more than 25% of gross revenue from the sale of personal data.

Hereafter these covered persons are referred to as controllers.

The following entities are exempt from coverage under the law:

  • A body, authority, board, bureau, commission, district, or agency of the state of Montana or any of its political subdivisions;
  • A nonprofit organization;
  • An institution of higher education;
  • A national securities association that is registered under 15 U.S.C. 78o-3 of the federal Securities Exchange Act of 1934;
  • A financial institution or an affiliate of a financial institution governed by Title V of the Gramm-Leach-Bliley Act;
  • A covered entity or business associate as defined in the privacy regulations of the federal Health Insurance Portability and Accountability Act (HIPAA).

Who is protected by the law?

Under the law, a protected consumer is defined as an individual who resides in the state of Montana.

However, the term consumer does not include an individual acting in a commercial or employment context, or as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency whose communications or transactions with the controller occur solely within the context of that individual’s role with the company, partnership, sole proprietorship, nonprofit, or government agency.

What data is protected by the law?

The statute protects personal data defined as information that is linked or reasonably linkable to an identified or identifiable individual.

There are several exemptions to protected personal data, including for data protected under HIPAA and other federal statutes.

What are the rights of consumers?

Under the new law, consumers have the right to:

  • Confirm whether a controller is processing the consumer’s personal data;
  • Access personal data processed by a controller;
  • Delete personal data;
  • Obtain a copy of personal data previously provided to a controller; and
  • Opt out of the processing of the consumer’s personal data for the purposes of targeted advertising, the sale of personal data, and profiling in furtherance of solely automated decisions that produce legal or similarly significant effects.

What obligations do businesses have?

The controller shall comply with requests by a consumer set forth in the statute without undue delay but no later than 45 days after receipt of the request.

If a controller declines to act regarding a consumer’s request, the business shall inform the consumer without undue delay, but no later than 45 days after receipt of the request, of the reason for declining.

The controller shall also conduct and document a data protection assessment for each of its processing activities that present a heightened risk of harm to a consumer.

How is the law enforced?

Under the statute, the state attorney general has exclusive authority to enforce violations of the statute. There is no private right of action under Montana’s statute.

Jackson Lewis P.C. © 2023

First BIPA Trial Results in $228M Judgment for Plaintiffs

Businesses defending class actions under the Illinois Biometric Information Privacy Act (BIPA) have struggled to defeat claims in recent years, as courts have rejected a succession of defenses.

We have been following this issue and have previously reported on this trend, which continued last week in the first BIPA class action to go to trial. The Illinois federal jury found that BNSF Railway Co. violated BIPA, resulting in a $228 million award to a class of more than 45,000 truck drivers.

Named plaintiff Richard Rogers filed suit in Illinois state court in April 2019, and BNSF removed the case to the US District Court for the Northern District of Illinois. Plaintiff alleged on behalf of a putative class of BNSF truck drivers that BNSF required the drivers to provide biometric identifiers in the form of fingerprints and hand geometry to access BNSF’s facilities. The lawsuit alleged BNSF violated BIPA by (i) failing to inform class members their biometric identifiers or information were being collected or stored prior to collection, (ii) failing to inform class members of the specific purpose and length of term for which the biometric identifiers or information were being collected, and (iii) failing to obtain informed written consent from class members prior to collection.

In October 2019, the court rejected BNSF’s legal defenses that the class’s BIPA claims were preempted by three federal statutes governing interstate commerce and transportation: the Federal Railroad Safety Act, the Interstate Commerce Commission Termination Act, and the Federal Aviation Administration Authorization Act. The court held that BIPA’s regulation of how BNSF obtained biometric identifiers or information did not unreasonably interfere with federal regulation of rail transportation, motor carrier prices, routes, or services, or safety and security of railroads.

Throughout the case, including at trial, BNSF also argued it should not be held liable where the biometric data was collected by its third-party contractor, Remprex LLC, which BNSF hired to process drivers at the gates of BNSF’s facilities. In March 2022, the court denied BNSF’s motion for summary judgment, pointing to evidence that BNSF employees were also involved in registering drivers in the biometric systems and that BNSF gave direction to Remprex regarding the management and use of the systems. The court concluded (correctly, as it turned out) that a jury could find that BNSF, not just Remprex, had violated BIPA.

The case proceeded to trial in October 2022 before US District Judge Matthew Kennelly. At trial, BNSF continued to argue it should not be held responsible for Remprex’s collection of drivers’ fingerprints. Plaintiff’s counsel argued BNSF could not avoid liability by pleading ignorance and pointing to a third-party contractor that BNSF controlled. Following a five-day trial and roughly one hour of deliberations, the jury returned a verdict in favor of the class, finding that BNSF recklessly or intentionally violated BIPA 45,600 times. The jury did not calculate damages. Rather, because BIPA provides for $5,000 in liquidated damages for every willful or reckless violation (and $1,000 for every negligent violation), Judge Kennelly applied BIPA’s damages provision to the jury’s finding (45,600 violations × $5,000 per violation), which resulted in a judgment of $228 million in damages. The judgment does not include attorneys’ fees, to which plaintiff is entitled and which plaintiff will inevitably seek under BIPA.

While an appeal will almost certainly follow, the BNSF case serves as a stark reminder of the potential exposure companies face under BIPA. Businesses that collect biometric data must ensure they do so in compliance with BIPA and other biometric privacy regulations. Where BIPA claims have been asserted, companies should promptly seek outside counsel to develop a legal strategy for a successful resolution.

© 2022 ArentFox Schiff LLP

Former Uber Security Chief Found Guilty in Criminal Trial for Failure to Disclose Breach to FTC

On October 5, 2022, former Uber security chief Joe Sullivan was found guilty by a jury in U.S. federal court of failing to disclose a breach of Uber customer and driver data to the FTC in the midst of an ongoing FTC investigation into the company. Sullivan was charged with one count of obstructing an FTC investigation and one count of misprision, the act of concealing a felony from authorities.

The government alleged that in 2016, in the midst of an ongoing FTC investigation into Uber for a 2014 data breach, Sullivan learned of a new breach that affected the personal information of more than 57 million Uber customers and drivers. The hackers allegedly demanded a ransom of at least $100,000 from Uber. Instead of reporting the new breach to the FTC, Sullivan and his team allegedly paid the ransom and had the hackers sign a nondisclosure agreement. Sullivan also allegedly did not report the breach to Uber’s General Counsel. Uber did not publicly disclose the incident or inform the FTC until 2017, when Uber’s new chief executive, Dara Khosrowshahi, joined the company.

This case is significant because it represents the first time a company executive has faced criminal prosecution related to the handling of a data breach.

Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

Thailand’s Personal Data Protection Act Enters into Force

On June 1, 2022, Thailand’s Personal Data Protection Act (“PDPA”) entered into force after three years of delays. The PDPA, originally enacted in May 2019, provides for a one-year grace period, with the main operative provisions of the law originally set to come into force in 2020. Due to the COVID-19 pandemic, however, the Thai government issued royal decrees to extend the compliance deadline to June 1, 2022. 

The PDPA mirrors the EU General Data Protection Regulation (“GDPR”) in many respects. Specifically, it requires data controllers and processors to have a valid legal basis for processing personal data (i.e., data that can identify living natural persons directly or indirectly). If such personal data is sensitive personal data (such as health data, biometric data, race, religion, sexual preference and criminal record), data controllers and processors must ensure that data subjects give explicit consent for any collection, use or disclosure of such data. Exemptions are granted for public interest, contractual obligations, vital interest or compliance with the law.

The PDPA applies both to entities in Thailand and to entities abroad that process personal data in connection with the provision of products or services in Thailand. As under the GDPR, data subjects are guaranteed rights, including the rights to be informed, to access, rectify and update data, to restrict and object to processing, and to data erasure and portability. Breaches may result in fines between THB500,000 (U.S.$14,432) and THB5 million, plus punitive compensation. Certain breaches involving sensitive personal data and unlawful disclosure also carry criminal penalties, including imprisonment of up to one year.

Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

Navigating the Data Privacy Landscape for Autonomous and Connected Vehicles: Best Practices

Autonomous and connected vehicles, and the data they collect, process and store, create high demands for strong data privacy and security policies. Accordingly, in-house counsel must define holistic data privacy best practices for consumer and B2B autonomous vehicles that balance compliance, safety, consumer protections and opportunities for commercial success against a patchwork of federal and state regulations.

Understanding key best practices related to the collection, use, storage and disposal of data will help in-house counsel frame balanced data privacy policies for autonomous vehicles and consumers. This is the inaugural article in our series on privacy policy best practices related to:

  1. Data collection

  2. Data privacy

  3. Data security

  4. Monetizing data

Autonomous and Connected Vehicles: Data Protection and Privacy Issues

The spirit of America is tightly intertwined with the concept of personal liberty, including freedom to jump in a car and go… wherever the road takes you. As the famous song claims, you can “get your kicks on Route 66.” But today you don’t just get your kicks. You also get terabytes of data on where you went, when you left and arrived, how fast you traveled to get there, and more.

Today’s connected and semi-autonomous vehicles actively collect as much as 100 times more data than a personal smartphone, precipitating a revolution that will drive changes not just to automotive manufacturing, but to our culture, economy, infrastructure, and legal and regulatory landscapes.

As our cars become computers, the volume and specificity of the data collected continue to grow. The future is now, or at least very near. Global management consultant McKinsey estimates that “full autonomy with Level 5 technology—operating anytime, anywhere” could arrive as soon as the next decade.

This near-term future isn’t only for consumer automobiles and ride-sharing robo taxis. B2B industries, including logistics and delivery, agriculture, mining, waste management and more are pursuing connected and autonomous vehicle deployments.

In-house counsel must balance evolving regulations at the federal and state level, as well as consider cross-border and international regulations for global technologies. In the United States, the Federal Trade Commission (FTC) is the primary federal regulator for data privacy, alongside individual states that are developing their own regulations, with the California Consumer Privacy Act (CCPA) leading the way. Virginia’s and Colorado’s new laws come into effect in 2023, as does the California Privacy Rights Act, and a half dozen more states are expected to enact new privacy legislation in the near future.

While federal and state regulations continue to evolve, mobility companies in the consumer and B2B sectors need to make decisions today about their own data privacy and security policies in order to balance compliance and consumer protection with opportunities for commercial success.

Understanding Types of Connected and Autonomous Vehicles

Autonomous, semi-autonomous, self-driving, connected, networked: in this developing category, these descriptions are often used interchangeably in leading business and industry publications. B2B International defines “connected vehicles (CVs) [as those that] use the latest technology to communicate with each other and the world around them,” whereas “autonomous vehicles (AVs)… are capable of recognizing their environment via the use of on-board sensors and global positioning systems in order to navigate with little or no human input. Examples of autonomous vehicle technology already in action in many modern cars include self-parking and auto-collision avoidance systems.”

But SAE International and the National Highway Traffic Safety Administration (NHTSA) go further, defining six levels of driving automation (Levels 0 through 5) for self-driving cars.

[Graphic: SAE Levels of Driving Automation™ in Self-Driving Cars]

Level 3 and above autonomous driving is getting closer to reality every day because of an array of technologies, including sensors, radar, sonar, lidar, biometrics, artificial intelligence and advanced computing power.

Approaching a Data Privacy Policy for Connected and Autonomous Vehicles

Because the mobility tech ecosystem is so dynamic, many companies, though well intentioned, inadvertently start with insufficient data privacy and security policies for their autonomous vehicle technology. The focus for these early- and second-stage companies is on bringing a product to market; when sales accelerate, there is an urgent need to ensure their data privacy policies are comprehensive and compliant.

Whether companies are drafting initial policies or revising existing ones, there are general data principles that can guide policy development across the lifecycle of data:

  • Collect: Only collect the data you need.
  • Use: Only use data for the reason you informed the consumer.
  • Store: Ensure reasonable data security protections are in place.
  • Dispose: Dispose of the data when it’s no longer needed.
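For readers who want to see how these lifecycle principles might translate into engineering practice, the following is a minimal, hypothetical sketch in Python. The purposes, field names, and retention periods are illustrative assumptions only; they are not drawn from any statute, regulation, or specific company policy discussed in this article.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    # Hypothetical illustration: the purposes, field names, and retention
    # periods below are assumptions, not requirements from any statute.

    DISCLOSED_PURPOSES = {"navigation", "driver_alertness_monitoring"}

    @dataclass
    class CollectedRecord:
        field_name: str          # e.g., "gps_trace"
        purpose: str             # purpose disclosed to the consumer at collection
        collected_at: datetime
        retention_days: int      # how long the stated purpose requires keeping the data

    def may_collect(purpose: str) -> bool:
        """Collect only data tied to a purpose disclosed to the consumer."""
        return purpose in DISCLOSED_PURPOSES

    def due_for_disposal(record: CollectedRecord, now: Optional[datetime] = None) -> bool:
        """Flag data for disposal once its retention window has lapsed."""
        now = now or datetime.utcnow()
        return now - record.collected_at > timedelta(days=record.retention_days)

    # Example: a location trace retained 30 days for navigation is flagged after that window.
    trace = CollectedRecord("gps_trace", "navigation", datetime(2022, 1, 1), 30)
    print(may_collect(trace.purpose))                     # True
    print(due_for_disposal(trace, datetime(2022, 3, 1)))  # True

In practice, the disclosed purposes and retention windows would come from the company’s published privacy policy and counsel’s guidance rather than being hard-coded as they are in this sketch.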

Additionally, for many companies, framing autonomous and connected vehicle data protection and privacy issues through a safety lens can help determine the optimal approach to constructing policies that support the goals of the business while satisfying federal and state regulations.

For example, a company that monitors driver alertness (critical for safety in today’s Level 2 AV environment) through biometrics is, by design, collecting data on each driver who uses the car. This scenario clearly supports vehicle and driver safety while at the same time implicating U.S. data privacy law.

In the emerging regulatory landscape, in-house counsel will continue to be challenged to balance safety and privacy. Biometrics will become even more prevalent in connection with identification and authentication, along with other driver-monitoring technologies, for all connected and autonomous vehicles, but particularly in commercial fleet deployments.

Developing Best Practices for Data Privacy Policies

In-house counsel at autonomous vehicle companies are responsible for constructing their company’s data privacy and security policies. Best practices should be set around:

  • What data to collect and when

  • How collected data will be used

  • How to store collected data securely

  • Data ownership and monetization

Today, the CCPA sets the standard for rigorous consumer protections related to data ownership and privacy. However, in this evolving space, counsel will need to monitor and adjust their company’s practices and policies to comply with new regulations as they continue to develop in the U.S. and countries around the world.

Keeping best practices related to the collection, use, storage and disposal of data in mind will help in-house counsel construct policies that balance consumer protections with safety and the commercial goals of their organizations.

A parting consideration is opportunistic, if extralegal: companies that choose to advocate strongly for customer protections have a powerful, positive opportunity to position themselves as responsible corporate citizens.

© 2022 Varnum LLP

White House Report May Have Long-Term Effect on Consumer Privacy and How Companies Do Business

A recent White House report on consumer data privacy forecasts a multifaceted approach to fulfilling public expectations regarding the protection of consumers’ personal information.

In February 2012 the White House released a report detailing the current administration’s position on consumer privacy, entitled Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy.  Although it is uncertain if the report will result in new privacy legislation in the near term, the report may still have long-term implications for the current regulatory landscape.

As explained in the report’s Executive Summary, the consumer privacy framework proposed by the administration consists of four key elements: (1) a Consumer Privacy Bill of Rights; (2) a “multistakeholder” process to specify how the principles in the Consumer Privacy Bill of Rights apply in particular business  contexts; (3) effective enforcement; and (4) a commitment to increase interoperability with the privacy frameworks of international partners. Below we examine each of these elements.

1. Consumer Privacy Bill of Rights

Building upon Fair Information Practice Principles that were first promulgated by the U.S. Department of Health, Education, and Welfare in the 1970s, the Consumer Privacy Bill of Rights is intended to affirm consumer expectations with regard to how companies handle personal data.2  Although the administration recognizes consumers have “certain responsibilities” to protect their own privacy, it also emphasizes the importance of using personal data in a manner consistent with the context in which it is collected.

In a press release accompanying the report, the White House summarized the basic tenets of the Consumer Privacy Bill of Rights3:

Transparency—Consumers have a right to easily understandable information about privacy and security practices.

Respect for Context—Consumers have a right to expect that organizations will collect, use and disclose personal data in ways that are consistent with the context in which consumers provide the data.4

Security—Consumers have a right to secure and responsible handling of personal data.

Access and Accuracy—Consumers have a right to access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data are inaccurate.

Focused Collection—Consumers have a right to reasonable limits on the personal data that companies collect and retain.

Accountability—Consumers have a right to have personal data handled by companies with appropriate measures in place to assure they adhere to the Consumer Privacy Bill of Rights.

The outline for the Consumer Privacy Bill of Rights is largely aspirational, in that it does not create any enforceable obligations.  Instead, the framework simply creates suggested guidelines for companies that collect personal data as a primary, or even ancillary, function of their business operations.  As the administration recognizes, in the absence of legislation these are only “general principles that afford companies discretion in how they implement them.”5

Nevertheless, as consumers become more invested in how their personal information is used, a company that disregards the basic tenets of the Consumer Privacy Bill of Rights may be doing so at its own peril.  Although the Consumer Privacy Bill of Rights has not been codified, companies should expect that some iteration of the same principles will ultimately be legislated, or voluntarily adopted by enough industry leaders to render them enforceable by the FTC.  Therefore, companies would be well-advised to make sure they have coherent privacy policies in place now in order to avoid running afoul of guidelines imposed by whatever regulatory framework is implemented later.

2. The “Multistakeholder” Process to Develop Enforceable Codes of Conduct

The report also encourages stakeholders—described by the Administration as “companies, industry groups, privacy advocates, consumer groups, crime victims, academics, international partners, State Attorneys General, Federal civil and criminal law enforcement representatives, and other relevant groups”—to cooperate in the development of rules implementing the principles outlined in the Consumer Privacy Bill of Rights.  Of all the elements comprising the administration’s consumer privacy framework, it is this “multistakeholder” process that will likely see the most activity in the coming months.

The report identifies several benefits attributable to this approach6:  First, an open process reflects the character of the internet itself as an “open, decentralized, user-driven platform for communication, innovation and economic growth.”  Second, participation of multiple stakeholders encourages flexibility, speed and creativity.  Third, this approach is likely to produce solutions “in a more timely fashion than regulatory processes and treaty-based organizations.”  Finally, the multistakeholder process allows experts to focus on specific challenges, rather than relying upon centralized authority.

The report contemplates that the multistakeholder process  will be moderated by the U.S. Department of Commerce’s National Telecommunications and Information Administration (NTIA), a view echoed by the press release accompanying the report.7  This process will likely present companies whose operations involve the collection of consumer data online—a rapidly expanding category that encompasses far more than just internet businesses—with an opportunity to shape future internet privacy legislation.

NTIA has already initiated the conversation through the issuance of a Request for Public Comments on the administration’s consumer privacy framework.8  NTIA has suggested the first topic for discussion should be a “discrete issue that allows consumers and businesses to engage [in] and conclude multistakeholder discussions in a reasonable timeframe.”9    As  one example, NTIA has suggested stakeholders discuss how the  Consumer Privacy Bill of Rights’ “transparency” principle should be applied to privacy notices for mobile applications.  When one considers that by some estimates the revenue generated by the mobile application market is expected to reach $25 billion over the next four years, it is clear that even this “discrete” issue alone could result in a significant regulatory impact.10

3. Effective Enforcement

The report further suggests that the Federal Trade Commission (FTC) will play a vital role in the enforcement of the consumer privacy protections outlined by the administration and developed during the multistakeholder process.  The administration admits, however, that in the absence of new legislation, the FTC’s authority in the area of consumer privacy may be limited to the enforcement of guidelines adopted by companies voluntarily.

According to the administration, enforcement actions “by the FTC (and State Attorneys General) have established that companies’ failures to adhere to voluntary privacy commitments, such as those stated in privacy policies, are actionable under the FTC Act’s (and State analogues) prohibition on unfair or deceptive acts or practices.”11  Therefore, in the administration’s view, the guidelines developed during the multistakeholder process would be enforceable under the existing statutory framework.

In light of the current election cycle and the resulting political landscape, it seems unlikely Congress will pass new consumer privacy legislation in the near term.  Nevertheless, companies should remain mindful that the FTC—and even state Attorneys General—may become more aggressive in addressing flagrant violations of consumers’ privacy expectations.  For instance, California’s Attorney General has explained that her office intends to enforce an agreement that California reached with Apple and other industry leaders earlier this year.  The agreement would require developers of mobile applications to post conspicuous privacy policies that explain how users’ personal information is gathered and used.

Moreover, the increased attention directed at privacy issues by consumer groups and the public at large suggests an inevitable groundswell of support for new privacy legislation.  As Jon Leibowitz, the chairman of the FTC, explained earlier this week, we could see new privacy legislation early in the term of the next Congress.12

4. A Commitment to Increased Interoperability

Recognizing that other countries have taken different approaches to data privacy issues, the report also encourages the development of interoperability with regulatory regimes implemented internationally.  The administration has suggested a three-pronged approach to achieving increased interoperability: mutual recognition, development of codes of conduct through multistakeholder processes, and enforcement cooperation.

With respect to mutual recognition, the report identifies existing examples of transnational cooperation in the privacy context.  For example, it cites the Asia-Pacific Economic Cooperation’s voluntary system of Cross Border Privacy Rules and also the European Union’s Data Protection Directive.  It appears that the administration, at least for now, will depend upon companies’ voluntary adoption of these international frameworks.

Just as the administration will rely upon the multistakeholder process to develop domestic codes of conduct, it will adopt the same approach to developing globally applicable rules and guidelines.  Although the administration contemplates this process will be directed by the U.S. Departments of Commerce and State, the report does not provide any details.

Finally, the report explains the FTC will spearhead the U.S. Government’s efforts to cooperate with its foreign counterparts in the “development of privacy enforcement priorities, sharing of best practices, and support for joint enforcement initiatives.”13


1  Report at 1. 

2  Although businesses are also “consumers,” the report appears to focus on protecting individuals’ personally identifiable information. 

3  We Can’t Wait: Obama Administration Unveils Blueprint for a “Privacy Bill of Rights” to Protect Consumers Online, February 23, 2012, Office of the Press Secretary. 

4 To illustrate the “context” principle, the report provides the example of a hypothetical social networking provider.  Users expect that certain biographical information will be collected in order to improve the service; however, if the provider sells the same biographical information to an information broker for advertising purposes, that use is more attenuated from users’ expectations.  Therefore, the latter use is not consistent with the “context” in which the biographical information was provided. 

5  Report at 2. 

6  Report at 23. 

7  We Can’t Wait, February 23, 2012, Office of the Press Secretary (“In the coming weeks, the Commerce Department’s National Telecommunications and Information Administration will convene stakeholders … .”). 

8  Docket No. 120214135-2135-01, February 29, 2012. 

9 Moving Forward with the Consumer Privacy Bill of Rights, Lawrence E. Strickling, Assistant Secretary for Communications and Information, February 29, 2012. 

10 According to Markets & Markets, a market research company and consulting firm. 

11 Report at 29. 

12 U.S. Agency Seeks Tougher Consumer Privacy Rules, The New York Times, March 26, 2012. 

13 Report at 33. 

© 2012 McDermott Will & Emery