EVERYTHING’S FINE: Big TCPA Win For Medical Debt Collector Suggests FCC Rulings Still Binding After Loper Bright – Let’s Hope It Stays That Way

Fascinating little case for you all today.

Consumer visits hospital for treatment. Provides phone number at admission. Receives treatment and is discharged.

Consumer fails to pay resulting invoices. Hospital and provider network turn account over to collections. Debt collector allegedly uses an ATDS to call consumer on the number she provided.

What result?

Prior to the Supreme Court’s Loper Bright decision, the determination would have been easy. The FCC held back in 2009 that providing a number in connection with a transaction permits autodialed calls to a consumer in connection with that transaction. And the Sixth Circuit Court of Appeals has directly held that providing a phone number on hospital intake documents permits later debt collection activity at that number, including via autodialer.

But the Loper Bright decision recently destroyed Chevron deference, meaning courts no longer have to yield to agency determinations of this sort. And while the Hobbs Act affords extra protections to certain FCC rulings, those protections apply only where certain procedural requirements were met by the Commission in adopting the rule.

So does the FCC’s rule from 2009 permitting informational calls to numbers provided in connection with a transaction still bind courts? According to the decision in Woodman v. Medicredit, 2024 WL 4132732 (D. Nev. Sept. 9, 2024), the answer is yes!

In Woodman, the defendant debt collector moved for summary judgment, arguing the Plaintiff consented when she provided her number to the hospital. The Court had little trouble applying the FCC’s 2009 order, and precedent predating Loper Bright, to grant summary judgment to the defense. So just like that, the case is gone.

Great ruling for the defense, of course, and it makes me feel a bit better about the whole “no one knows what the law is anymore” thing. But the Woodman court didn’t really address the core issue: was the 2009 ruling adopted with sufficient APA pomp and circumstance to merit Hobbs Act deference under PDR Network?

Really interesting question and one folks should keep in mind.

Legal and Privacy Considerations When Using Internet Tools for Targeted Marketing

Businesses often rely on targeted marketing methods to reach their relevant audiences. Instead of paying for, say, a television commercial to be viewed by people across all segments of society with varied purchasing interests and budgets, a business can use tools provided by social media platforms and other internet services to target those people most likely to be interested in its ads. These tools may make targeted advertising easy, but businesses must be careful when using them – along with their ease of use comes a risk of running afoul of legal rules and regulations.

Two ways that businesses target audiences are working with influencers who have large followings in relevant segments of the public (which may implicate false or misleading advertising issues) and using third-party “cookies” to track users’ browsing history (which may implicate privacy and data protection issues). Most popular social media platforms offer tools to facilitate the use of these targeting methods. These tools are likely indispensable for some businesses, and despite their risks, they can be deployed safely once the risks are understood.

Some Platform-Provided Targeted Marketing Tools May Implicate Privacy Issues
Google recently announced1 that it will not deprecate third-party cookies, a reversal of its previous plan to phase out these cookies. “Cookies” are small text files stored by the browser that track users’ activity online. “First-party” cookies are set by the website the user is visiting and often are necessary for the website to function properly. “Third-party” cookies are set by other domains and shared across websites and companies, essentially tracking users’ browsing behaviors to help advertisers target their relevant audiences.
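The first-party/third-party distinction described above is visible in how cookies are set. A minimal sketch using Python’s standard http.cookies module (the cookie names and values are illustrative only): modern browsers default to a same-site policy, so a cookie meant to be sent in cross-site requests — the third-party tracking scenario — must be explicitly marked SameSite=None and Secure.

```python
from http.cookies import SimpleCookie

# First-party cookie: sent only in same-site contexts under the
# default "Lax" policy most browsers now apply.
first_party = SimpleCookie()
first_party["session_id"] = "abc123"
first_party["session_id"]["samesite"] = "Lax"

# Third-party cookie: to travel with cross-site requests (e.g., an
# embedded ad or tracking pixel), it must opt out of the same-site
# default with SameSite=None, which in turn requires Secure.
third_party = SimpleCookie()
third_party["tracker_id"] = "xyz789"
third_party["tracker_id"]["samesite"] = "None"
third_party["tracker_id"]["secure"] = True

print(first_party.output())   # Set-Cookie header for the same-site cookie
print(third_party.output())   # Set-Cookie header marked SameSite=None; Secure
```

The point for compliance purposes: the browser enforces only the delivery mechanics; nothing in the header controls what the receiving party does with the data.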

In early 2020, Google announced2 that it would phase out third-party cookies, which are associated with privacy concerns because they track individual web-browsing activity and then share that data with other parties. Google’s 2020 announcement was a response to these concerns.

Fast forward about four and a half years, and Google reversed course. During that time, Google had introduced alternatives to third-party cookies, and companies had developed their own, often extensive, proprietary databases3 of information about their customers. However, none of these methods satisfied the advertising industry. Google then made the decision to keep third-party cookies. To address privacy concerns, Google said it would “introduce a new experience in Chrome that lets people make an informed choice that applies across their web browsing, and they’d be able to adjust that choice at any time.”4

Many large platforms in addition to Google offer targeted advertising services via the use of third-party cookies. Can businesses use these services without any legal ramifications? Does the possibility for consumers to opt out mean that a business cannot be liable for privacy violations if it relies on third-party cookies? The relevant cases have held that individual businesses still must be careful despite any opt-out and other built-in tools offered by these platforms.

Two recent cases from the Southern District of New York5 held that individual businesses that used “Meta Pixels” to track consumers may be liable for violations of the Video Privacy Protection Act (VPPA), 18 U.S.C. § 2710. Facebook defines a Meta Pixel6 as a “piece of code … that allows you to … make sure your ads are shown to the right people … drive more sales, [and] measure the results of your ads.” In other words, a Meta Pixel is essentially a cookie provided by Meta/Facebook that helps businesses target ads to relevant audiences.

As demonstrated by those two recent cases, businesses cannot rely on a platform’s program to ensure their ad targeting efforts do not violate the law. These violations may expose companies to enormous damages – VPPA cases often are brought as class actions and even a single violation may carry damages in excess of $2,500.

In those New York cases, the consumers had not consented to sharing information, but even if they had, the consent may not have sufficed. Internet contracts, often included in a website’s Terms of Service, are notoriously difficult to enforce. For example, in one of those S.D.N.Y. cases, the court found that the arbitration clause to which subscribers had agreed was not effective to compel arbitration in lieu of litigation for this matter. In addition, the type of consent and the information that websites need to provide before sharing information can be extensive and complicated, as recently reported7 by my colleagues.

Another issue that companies may encounter when relying on widespread cookie offerings is whether the mode (as opposed to the content) of data transfer complies with all relevant privacy laws. For example, the Swedish Data Protection Agency recently found8 that a company had violated the European Union’s General Data Protection Regulation (GDPR) because the method of transfer of data was not compliant. In that case, some of the consumers had consented, but some were never asked for consent.

Some Platform-Provided Targeted Marketing Tools May Implicate False or Misleading Advertising Issues
Another method that businesses use to target their advertising to relevant consumers is to hire social media influencers to endorse their products. These partnerships between brands and influencers can be beneficial to both parties and to the audiences who are guided toward the products they want. These partnerships are also subject to pitfalls, including reputational pitfalls (a controversial statement by the influencer may negatively impact the reputation of the brand) and legal pitfalls.

The Federal Trade Commission (FTC) has issued guidelines, the “Guides Concerning the Use of Endorsements and Testimonials in Advertising,”9 and published a brochure for influencers, “Disclosures 101 for Social Media Influencers,”10 that tells influencers how they must apply the guidelines to avoid liability for false or misleading advertising when they endorse products. A key requirement is that influencers must “make it obvious” when they have a “material connection” with the brand. In other words, the influencer must disclose that it is being paid (or receives other, non-monetary benefits) to make the endorsement.

Many social media platforms make it easy to disclose a material connection between a brand and an influencer – a built-in function allows influencers to simply click a check mark to disclose the existence of a material connection with respect to a particular video endorsement. The platform then displays a hashtag or other notification along with the video that says “#sponsored” or something similar. However, influencers cannot rely on these built-in notifications. The FTC brochure clearly states: “Don’t assume that a platform’s disclosure tool is good enough, but consider using it in addition to your own, good disclosure.”

Brands that sponsor influencer endorsements may easily find themselves on the hook if the influencer does not properly disclose that the influencer and the brand are materially connected. In some cases, the contract between the brand and influencer may pass any risk to the brand. In others, the influencer may be judgment proof, or the brand is simply an easier target for enforcement. And, unsurprisingly, the FTC has sent warning letters11 threatening high penalties to brands for influencer violations.

The Platform-Provided Tools May Be Deployed Safely
Despite the risks involved in some platform-provided tools for targeted marketing, these tools are very useful, and businesses should continue to take advantage of them. However, businesses cannot rely solely on these widely available and easy-to-use tools; they must ensure that their own policies and compliance programs protect them from liability.

The same warning about widely available social media tools and lessons for a business to protect itself are also true about other activities online, such as using platforms’ built-in “reposting” function (which may implicate intellectual property infringement issues) and using out-of-the-box website builders (which may implicate issues under the Americans with Disabilities Act). A good first step for a business to ensure legal compliance online is to understand the risks. An attorney experienced in internet law, privacy law and social media law can help.

_________________________________________________________________________________________________________________

1 https://privacysandbox.com/news/privacy-sandbox-update/

2 https://blog.chromium.org/2020/01/building-more-private-web-path-towards.html

3 Businesses should ensure that they protect these databases as trade secrets. See my recent Insights at https://www.wilsonelser.com/sarah-fink/publications/relying-on-noncompete-clauses-may-not-be-the-best-defense-of-proprietary-data-when-employees-depart and https://www.wilsonelser.com/sarah-fink/publications/a-practical-approach-to-preserving-proprietary-competitive-data-before-and-after-a-hack

4 https://privacysandbox.com/news/privacy-sandbox-update/

5 Aldana v. GameStop, Inc., 2024 U.S. Dist. LEXIS 29496 (S.D.N.Y. Feb. 21, 2024); Collins v. Pearson Educ., Inc., 2024 U.S. Dist. LEXIS 36214 (S.D.N.Y. Mar. 1, 2024)

6 https://www.facebook.com/business/help/742478679120153?id=1205376682832142

7 https://www.wilsonelser.com/jana-s-farmer/publications/new-york-state-attorney-general-issues-guidance-on-privacy-controls-and-web-tracking-technologies

8 See, e.g., https://www.dataguidance.com/news/sweden-imy-fines-avanza-bank-sek-15m-unlawful-transfer

9 https://www.ecfr.gov/current/title-16/chapter-I/subchapter-B/part-255

10 https://www.ftc.gov/system/files/documents/plain-language/1001a-influencer-guide-508_1.pd

11 https://www.ftc.gov/system/files/ftc_gov/pdf/warning-letter-american-bev.pdf
https://www.ftc.gov/system/files/ftc_gov/pdf/warning-letter-canadian-sugar.pdf

“Arbitrary and Capricious” – A Sign of Things to Come?

On July 3, 2024, the US District Court for the Northern District of Texas issued a Memorandum Opinion and Order in the combined cases of Americans for Beneficiary Choice, et al. v. United States Department of Health and Human Services (Civ. Action No. 4:24-cv-00439) and Council for Medicare Choice, et al. v. United States Department of Health and Human Services (Civ. Action No. 4:24-cv-00446).

The Plaintiffs in these combined cases challenged a rule the Centers for Medicare & Medicaid Services (“CMS”) issued earlier this year. The new rule sweeps reimbursements to third-party firms into the definition of compensation, making them subject to the regulatory cap on compensation; the prior rules excluded such reimbursements from that definition.

The Memorandum Opinion and Order granted the Plaintiffs’ Motion for a Stay in part and denied it in part. The Motion was granted as to the new CMS rules governing compensation paid by Medicare Advantage and Part D plans to independent agents and brokers who help beneficiaries select and enroll in private plans.

The Court found that the compensation changes were arbitrary and capricious and that the Plaintiffs were substantially likely to succeed on the merits of the case. The Court found that CMS failed to substantiate key parts of the final rule. During the rulemaking process, industry commenters asked for clarification around parts of the rule, but CMS claimed “the sources Plaintiffs criticized were not significant enough to warrant defending them.” The Court found “because CMS failed to address important problems to their central evidence…that members of the public raised during the comment period, those aspects of the Final Rule are most likely arbitrary and capricious.”

One of the Plaintiffs, Americans for Beneficiary Choice, also challenged the consent requirement of the final rule, which provides that personal beneficiary data collected by a third-party marketing organization (“TPMO”) may be shared with another TPMO only if the beneficiary gives prior express written consent. The Plaintiff argued that the consent requirement is “in tension with HIPAA’s broader purpose of facilitating data sharing”; CMS responded that although HIPAA might facilitate data sharing, that does not limit CMS’s ability to restrict certain harmful data-sharing practices. The Court denied the Motion to Stay as to the consent requirement but, interestingly, stated that while Plaintiff’s “claim regarding the Consent Requirement may ultimately have merit, [Plaintiff]’s current briefing does not demonstrate a substantial likelihood of success at this stage.”

What does this mean now that we are less than 90 days from the start of the 2025 Medicare Advantage/Part D contract year?

  1. The consent requirement is still moving forward – While the memorandum order hints at the possibility of it being rejected, as of right now, TPMOs must get prior express written consent before sharing personal beneficiary data with another TPMO.
  2. The fixed-fee and contract-terms restrictions in the final rule have had their effective dates stayed until this suit is resolved. Therefore, the compensation scheme that was in place last year essentially remains in place for those two sections.

How does this affect the FCC’s 1:1 Ruling?

It doesn’t. While this case does show that courts are willing to look critically at agencies’ rulemaking processes, the FCC’s 1:1 consent requirement is different from the compensation changes set forth by CMS.

The FCC arguably just clarified the existing rule around prior express written consent by requiring the consent to “authorize no more than one identified seller”.

CMS, on the other hand, attempted to make wholesale changes and “began to set fixed rates for a wide range of administrative payments that were previously uncapped and unregulated as compensation.”

There is still the IMC case against the FCC, so there is a possibility (albeit small) of relief coming in that case. However, the advice here is to continue planning to obtain consent to share personal beneficiary data AND single-seller consent.

House Committee Postpones Markup Amid New Privacy Bill Updates

On June 27, 2024, the U.S. House of Representatives cancelled the House Energy and Commerce Committee markup of the American Privacy Rights Act (“APRA” or “Bill”) scheduled for that day, reportedly with little notice. There has been no indication of when the markup will be rescheduled; however, House Energy and Commerce Committee Chairwoman Cathy McMorris Rodgers issued a statement reiterating her support for the legislation.

On June 20, 2024, the House posted a third version of the discussion draft of the APRA. On June 25, 2024, two days before the scheduled markup session, Committee members introduced the APRA as a bill, H.R. 8818. Each version featured several key changes from earlier drafts, which are outlined collectively below.

Notable changes in H.R. 8818 include the removal of two key sections:

  • “Civil Rights and Algorithms,” which required entities to conduct covered algorithm impact assessments when algorithms posed a consequential risk of harm to individuals or groups; and
  • “Consequential Decision Opt-Out,” which allowed individuals to opt out of being subjected to covered algorithms.

Additional changes include the following:

  • The Bill introduces new definitions, such as “coarse geolocation information” and “online activity profile,” the latter of which refines a category of sensitive data. “Neural data” and “information that reveals the status of an individual as a member of the Armed Forces” are added as new categories of sensitive data. The Bill also modifies the definitions of “contextual advertising” and “first-party advertising.”
  • The data minimization section includes a number of changes, such as the addition of “conduct[ing] medical research” in compliance with applicable federal law as a new permitted purpose. The Bill also limits the ability to rely on permitted purposes in processing sensitive covered data, biometric and genetic information.
  • The Bill now allows not only covered entities (excluding data brokers or large data holders), but also service providers (that are not large data holders) to apply for the Federal Trade Commission-approved compliance guideline mechanism.
  • Protections for covered minors now include a prohibition on first-party advertising (in addition to targeted advertising) if the covered entity knows the individual is a minor, with limited exceptions acknowledged by the Bill. It also restricts the transfer of a minor’s covered data to third parties.
  • The Bill adds another preemption clause, clarifying that APRA would preempt any state law providing protections for children or teens to the extent such laws conflict with the Bill, but does not prohibit states from enacting laws, rules or regulations that offer greater protection to children or teens than the APRA.

For additional information about the changes, please refer to the unofficial redline comparison of all APRA versions published by the IAPP.

The Privacy Patchwork: Beyond US State “Comprehensive” Laws

We’ve cautioned before about the danger of thinking only about US state “comprehensive” laws when assessing legal privacy and data security obligations in the United States. We’ve also mentioned that the US has a patchwork of privacy laws. That patchwork exists to a certain extent outside of the US as well. What laws in the patchwork relate to a company’s activities?

There are laws that apply when companies host websites, including the most well-known, the California Online Privacy Protection Act (CalOPPA). It has been in effect since July 2004, thus predating the CCPA by 14 years. Then there are laws that apply if a company is collecting and using biometric identifiers, like Illinois’ Biometric Information Privacy Act.

Companies are subject to specific laws both in the US and elsewhere when engaging in digital communications. These laws include the US federal laws TCPA and TCFAPA, as well as CAN-SPAM. Digital communication laws exist in countries as wide-ranging as Australia, Canada, Morocco, and many others. Then we have laws that apply when collecting information during a credit card transaction, like California’s Song-Beverly Credit Card Act.

Putting It Into Practice: When assessing your company’s obligations under privacy and data security laws, keep activity specific privacy laws in mind. Depending on what you are doing, and in what jurisdictions, you may have more obligations to address than simply those found in comprehensive privacy laws.

American Privacy Rights Act Advances with Significant Revisions

On May 23, 2024, the U.S. House Committee on Energy and Commerce Subcommittee on Data, Innovation, and Commerce approved a revised draft of the American Privacy Rights Act (“APRA”), which was released just 36 hours before the markup session. With the subcommittee’s approval, the APRA will now advance to full committee consideration. The revised draft includes several notable changes from the initial discussion draft, including:

  • New Section on COPPA 2.0 – the revised APRA draft includes the Children’s Online Privacy Protection Act (COPPA 2.0) under Title II, which differs to a certain degree from the COPPA 2.0 proposal currently before the Senate (e.g., removal of the revised “actual knowledge” standard; removal of applicability to teens over age 12 and under age 17).
  • New Section on Privacy By Design – the revised APRA draft includes a new dedicated section on privacy by design. This section requires covered entities, service providers and third parties to establish, implement, and maintain reasonable policies, practices and procedures that identify, assess and mitigate privacy risks related to their products and services during the design, development and implementation stages, including risks to covered minors.
  • Expansion of Public Research Permitted Purpose – as an exception to the general data minimization obligation, the revised APRA draft adds another permissible purpose for processing data for public or peer-reviewed scientific, historical, or statistical research projects. These research projects must be in the public interest and comply with all relevant laws and regulations. If the research involves transferring sensitive covered data, the revised APRA draft requires the affirmative express consent of the affected individuals.
  • Expanded Obligations for Data Brokers – the revised APRA draft expands obligations for data brokers by requiring them to include a mechanism for individuals to submit a “Delete My Data” request. This mechanism, similar to the California Delete Act, requires data brokers to delete all covered data related to an individual that they did not collect directly from that individual, if the individual so requests.
  • Changes to Algorithmic Impact Assessments – while the initial APRA draft required large data holders to conduct and report a covered algorithmic impact assessment to the FTC if they used a covered algorithm posing a consequential risk of harm to individuals, the revised APRA requires such impact assessments for covered algorithms to make a “consequential decision.” The revised draft also allows large data holders to use certified independent auditors to conduct the impact assessments, directs the reporting mechanism to NIST instead of the FTC, and expands requirements related to algorithm design evaluations.
  • Consequential Decision Opt-Out – while the initial APRA draft allowed individuals to invoke an opt-out right against covered entities’ use of a covered algorithm making or facilitating a consequential decision, the revised draft now also allows individuals to request that consequential decisions be made by a human.
  • New and/or Revised Definitions – the revised APRA draft’s definition section includes new terms, such as “contextual advertising” and “first party advertising.” The revised APRA draft also redefines certain terms, including “covered algorithm,” “sensitive covered data,” “small business” and “targeted advertising.”

On July 1, 2024, Texas May Have the Strongest Consumer Data Privacy Law in the United States

It’s Bigger. But is it Better?

They say everything is bigger in Texas, and that now includes privacy protection. After the Texas Senate approved HB 4, the Texas Data Privacy and Security Act (“TDPSA”), on June 18, 2023, Texas became the eleventh state to enact comprehensive privacy legislation.[1]

Like many state consumer data privacy laws enacted this year, TDPSA is largely modeled after the Virginia Consumer Data Protection Act.[2] However, the law contains several unique differences and drew significant pieces from recently enacted consumer data privacy laws in Colorado and Connecticut, which generally include “stronger” provisions than the more “business-friendly” laws passed in states like Utah and Iowa.

Some of the more notable provisions of the bill are described below:

More Scope Than You Can Shake a Stick At!

  • The TDPSA applies much more broadly than any other pending or effective state consumer data privacy act, pulling in individuals as well as businesses regardless of their revenues or the number of individuals whose personal data is processed or sold.
  • The TDPSA applies to any individual or business that meets both of the following criteria:
    • conducts business in Texas (or produces goods or services consumed in Texas); and
    • processes or sells personal data.
      • The “processing or sale of personal data” further expands the applicability of the TDPSA to include individuals and businesses that engage in any operations involving personal data, such as the “collection, use, storage, disclosure, analysis, deletion, or modification of personal data.”
      • In short, collecting, storing or otherwise handling the personal data of any resident of Texas, or transferring that data for any consideration, will likely meet this standard.
  • Uniquely, the carveout for “small businesses” excludes from coverage those entities that meet the definition of “a small business as defined by the United States Small Business Administration.”[3]
  • The law requires all businesses, including small businesses, to obtain opt-in consent before processing sensitive personal data.
  • Similar to other state comprehensive privacy laws, TDPSA excludes state agencies or political subdivisions of Texas, financial institutions subject to Title V of the Gramm-Leach-Bliley Act, covered entities and business associates governed by HIPAA, nonprofit organizations, and institutions of higher education. But, TDPSA uniquely excludes electric utilities, power generation companies, and retail electric providers, as defined under Section 31.002 of the Texas Utilities Code.
  • Certain categories of information are also excluded, including health information protected by HIPAA or used in connection with human clinical trials, and information covered by the Fair Credit Reporting Act, the Driver’s Privacy Protection Act, the Family Educational Rights and Privacy Act of 1974, the Farm Credit Act of 1971, emergency contact information used for emergency contact purposes, and data necessary to administer benefits.

Don’t Mess with Texas Consumers

Texas’s longstanding libertarian roots are evidenced in the TDPSA’s strong menu of individual consumer privacy rights, including the right to:

  • Confirm whether a controller is processing the consumer’s personal data and access that data;
  • Correct inaccuracies in the consumer’s personal data, considering the nature of the data and the purposes of the processing;
  • Delete personal data provided by or obtained about the consumer;
  • Obtain a copy of the consumer’s personal data that the consumer previously provided to a controller in a portable and readily usable format, if the data is available digitally and it is technically feasible; and
  • Opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of a decision that produces legal or similarly significant effects concerning the consumer.

Data controllers are required to respond to consumer requests within 45 days, which may be extended by 45 days when reasonably necessary. The bill would also give consumers a right to appeal a controller’s refusal to respond to a request.
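The 45-day response window, with a single 45-day extension when reasonably necessary, is straightforward date arithmetic. As a rough sketch (the function name is illustrative, not anything from the statute):

```python
from datetime import date, timedelta

RESPONSE_WINDOW = timedelta(days=45)  # initial response period under the TDPSA
EXTENSION = timedelta(days=45)        # one extension when reasonably necessary

def response_deadlines(received: date) -> tuple[date, date]:
    """Return (initial due date, extended due date) for a consumer request
    received on the given date."""
    initial = received + RESPONSE_WINDOW
    return initial, initial + EXTENSION

# A request received on the law's effective date, July 1, 2024:
initial, extended = response_deadlines(date(2024, 7, 1))
print(initial)   # 2024-08-15
print(extended)  # 2024-09-29
```

A compliance calendar built this way also needs to track the notice to the consumer explaining the extension, which the controller must send within the initial 45-day window.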

Controller Hospitality

The Texas bill imposes a number of obligations on data controllers, most of which are similar to other state consumer data privacy laws:

  • Data Minimization – Controllers should limit data collection to what is “adequate, relevant, and reasonably necessary” to achieve the purposes of collection that have been disclosed to a consumer. Consent is required before processing information in ways that are not reasonably necessary or not compatible with the purposes disclosed to a consumer.
  • Nondiscrimination – Controllers may not discriminate against a consumer for exercising individual rights under the TDPSA, including by denying goods or services, charging different rates, or providing different levels of quality.
  • Sensitive Data – Consent is required before processing sensitive data, which includes personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, citizenship or immigration status, genetic or biometric data processed for purposes of uniquely identifying an individual; personal data collected from a child known to be under the age of 13, and precise geolocation data.
    • The Senate version of the bill excludes data revealing “sexual orientation” from the categories of sensitive information, which differs from all other state consumer data privacy laws.
  • Privacy Notice – Controllers must post a privacy notice (e.g. website policy) that includes (1) the categories of personal data processed by the controller (including any sensitive data), (2) the purposes for the processing, (3) how consumers may exercise their individual rights under the Act, including the right of appeal, (4) any categories of personal data that the controller shares with third parties and the categories of those third parties, and (5) a description of the methods available to consumers to exercise their rights (e.g., website form or email address).
  • Targeted Advertising – A controller that sells personal data to third parties for purposes of targeted advertising must clearly and conspicuously disclose to consumers their right to opt-out.

Assessing the Privacy of Texans

Unlike some of the “business-friendly” privacy laws in Utah and Iowa, the Texas bill requires controllers to conduct data protection assessments (“Data Privacy Protection Assessments” or “DPPAs”) for certain types of processing that pose heightened risks to consumers. The assessments must identify and weigh the benefits of the processing to the controller, the consumer, other stakeholders, and the public against the potential risks to the consumer as mitigated by any safeguards that could reduce those risks. In Texas, the categories that require assessments are identical to those required by Connecticut’s consumer data privacy law and include:

  • Processing personal data for targeted advertising;
  • The sale of personal data;
  • Processing personal data for profiling consumers, if such profiling presents a reasonably foreseeable risk to consumers of unfair or deceptive treatment, disparate impact, financial, physical or reputational injury, physical or other intrusion upon seclusion of private affairs, or “other substantial injury;”
  • Processing of sensitive data; and
  • Any processing activities involving personal data that present a “heightened risk of harm to consumers.”

Opting Out and About

Businesses are required to recognize a universal opt-out mechanism for consumers (such as the Global Privacy Control signal), similar to provisions in Colorado, Connecticut, California, and Montana. The law, however, gives a business more leeway to ignore such a signal if it cannot verify the consumer’s identity or lacks the technical ability to receive the signal.
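Under the Global Privacy Control proposal, browsers transmit a consumer’s universal opt-out preference as an HTTP request header, `Sec-GPC: 1`. As a minimal sketch of how a server might detect that signal (the function name and plain headers-dict interface are illustrative assumptions, not tied to any particular web framework):

```python
def honors_gpc(headers):
    """Return True if the request carries a Global Privacy Control opt-out.

    Per the GPC proposal, user agents signal the opt-out with the request
    header `Sec-GPC: 1`. HTTP header names are case-insensitive, so the
    lookup below normalizes them before comparing.
    """
    for name, value in headers.items():
        if name.lower() == "sec-gpc" and value.strip() == "1":
            return True
    return False


# Example: inspect an incoming request's headers.
print(honors_gpc({"Sec-GPC": "1"}))       # True  -> treat as an opt-out signal
print(honors_gpc({"User-Agent": "test"})) # False -> no GPC header present
```

A business honoring the signal would route `True` results into the same workflow used for an express opt-out of sale or targeted advertising.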

Show Me Some Swagger!

The Attorney General has the exclusive right to enforce the law, with violations punishable by civil penalties of up to $7,500 per violation. Businesses have a 30-day right to cure violations upon written notice from the Attorney General. Unlike several other laws, the right to cure has no sunset provision and would remain a permanent part of the law. The law does not include a private right of action.

Next Steps for TDPSA Compliance

For businesses that have already developed a state privacy compliance program, especially those modeled around Colorado and Connecticut, making room for TDPSA will be a streamlined exercise. However, businesses that are starting from scratch, including the “small businesses” defined in the law, need to get moving.

If TDPSA is your first ride in a state consumer privacy compliance rodeo, some first steps we recommend are:

  1. Update your website privacy policy for facial compliance with the law and make sure that notice is being given at or before the time of collection.
  2. Put procedures in place to respond to consumer privacy requests and ask for consent before processing sensitive information.
  3. Gather necessary information to complete data protection assessments.
  4. Identify vendor contracts that should be updated with mandatory data protection terms.

Footnotes

[1] As of date of publication, there are now 17 states that have passed state consumer data privacy laws (California, Colorado, Connecticut, Delaware, Florida, Indiana, Iowa, Kentucky, Maryland, Massachusetts, Montana, New Jersey, New Hampshire, Tennessee, Texas, Utah, Virginia) and two (Vermont and Minnesota) that are pending.

[2] See Code of Virginia, Chapter 53, Consumer Data Protection Act.

[3] This is notably broader than other state privacy laws, which establish threshold requirements based on revenues or the amount of personal data that a business processes. It will also make it more difficult to know what businesses are covered because SBA definitions vary significantly from one industry vertical to another. As a quick rule of thumb, under the current SBA size standards, a U.S. business with annual average receipts of less than $2.25 million and fewer than 100 employees will likely be small, and therefore exempt from the TDPSA’s primary requirements.

For more news on State Privacy Laws, visit the NLR Consumer Protection and Communications, Media & Internet sections.

Mid-Year Recap: Think Beyond US State Laws!

Much of the focus on US privacy has been US state laws, and the potential of a federal privacy law. This focus can lead one to forget, however, that US privacy and data security law follows a patchwork approach both at a state level and a federal level. “Comprehensive” privacy laws are thus only one piece of the puzzle. There are federal and state privacy and security laws that apply based on a company’s (1) industry (financial services, health care, telecommunications, gaming, etc.), (2) activity (making calls, sending emails, collecting information at point of purchase, etc.), and (3) the type of individual from whom information is being collected (children, students, employees, etc.). There have been developments this year in each of these areas.

On the industry side, there has been activity focused on data brokers, those in the health space, and those that sell motor vehicles. The FTC has focused on the activities of data brokers this year, beginning the year with a settlement with lead-generation company Response Tree. It also settled with X-Mode Social over the company’s collection and use of sensitive information. There has also been ongoing regulation and scrutiny of companies in the health space, including HHS’s new AI transparency rule. Finally, Utah enacted a Motor Vehicle Data Protection Act applicable to the data systems that car dealers use to house consumer information.

On the activity side, there has been less news, although in this area the “activity” of protecting information (or failing to do so) has continued to receive regulatory focus. This includes the SEC’s new cybersecurity reporting obligations for public companies, as well as minor modifications to Utah’s data breach notification law.

Finally, there have been new laws directed at particular individuals, in particular laws intended to protect children. These include social media laws in Florida and Utah, effective January 1, 2025 and October 1, 2024, respectively. They are similar to attempts to regulate social media’s collection of information from children in Arkansas, California, Ohio, and Texas, but the drafters hope they are sufficiently different to survive the challenges those laws currently face. The FTC is also exploring updates to its decades-old Children’s Online Privacy Protection Act rule.

Putting It Into Practice: As we approach the mid-point of the year, now is a good time to look back at privacy developments over the past six months. There have been many developments in the privacy patchwork, and companies may want to take the time now to ensure that their privacy programs have incorporated and addressed those laws’ obligations.


Congress Introduces Promising Bipartisan Privacy Bill

U.S. Senator Maria Cantwell (D-WA) and U.S. Representative Cathy McMorris Rodgers (R-WA) have made a breakthrough by agreeing on a bipartisan data privacy legislation proposal. The legislation aims to address concerns related to consumer data collection by technology companies and empower individuals to have control over their personal information.

The proposed legislation would restrict the amount of data technology companies can gather from consumers, a particularly important step given the large amount of data these companies already possess. Among its key provisions, the bill would:

  • Grant Americans the authority to prevent the sale of their personal information or request its deletion, giving individuals more control over their personal data;
  • Give the Federal Trade Commission (FTC) and state attorneys general significant authority to monitor and regulate matters related to consumer privacy;
  • Include robust enforcement measures, such as granting individuals the right to take legal action, so that violations are dealt with effectively;
  • Allow consumers to opt out of targeted advertising (which would not be prohibited outright), giving them more control over the ads they receive;
  • Apply the privacy violations listed in the legislation to telecommunications companies, ensuring that no company is exempt from consumer privacy laws; and
  • Require annual assessments of algorithms to ensure they do not harm individuals, particularly young people, an important step given the rise of technology and its impact on younger generations.

The bipartisan proposal for data privacy legislation is a positive step forward in terms of consumer privacy in America. While there is still work to be done, it is essential that the government takes proactive steps to ensure that individuals have greater control over their personal data. This is a positive development for the tech industry and consumers alike.

However, as we have reported before, this is not the first time Congress has made strides toward comprehensive data privacy legislation. Hopefully, this new bipartisan bill will enjoy more success than past efforts and bring the United States closer in line with international data privacy standards.

Supply Chains are the Next Subject of Cyberattacks

The cyberthreat landscape is evolving as threat actors develop new tactics to keep up with increasingly sophisticated corporate IT environments. In particular, threat actors are increasingly exploiting supply chain vulnerabilities to reach downstream targets.

The effects of supply chain cyberattacks are far-reaching: they can hit downstream organizations and can last long after the attack was first deployed. According to an Identity Theft Resource Center report, “more than 10 million people were impacted by supply chain attacks targeting 1,743 entities that had access to multiple organizations’ data” in 2022. According to an IBM analysis, the cost of a data breach averaged $4.45 million in 2023.

What is a supply chain cyberattack?

Supply chain cyberattacks are a type of cyberattack in which a threat actor targets a business offering third-party services to other companies. The threat actor will then leverage its access to the target to reach and cause damage to the business’s customers. Supply chain cyberattacks may be perpetrated in different ways.

  • Software-Enabled Attack: This occurs when a threat actor uses an existing software vulnerability to compromise the systems and data of organizations running the software containing the vulnerability. For example, Apache Log4j is an open-source logging library that developers use to add record-keeping of system activity to their software. In November 2021, there were public reports of a Log4j remote code execution vulnerability that allowed threat actors to infiltrate target software running outdated Log4j code versions. As a result, threat actors gained access to the systems, networks, and data of many organizations in the public and private sectors that used software containing the vulnerable Log4j version. Although security upgrades (i.e., patches) have since been issued to address the Log4j vulnerability, much software is still running outdated (i.e., unpatched) versions of Log4j.
  • Software Supply Chain Attack: This is the most common type of supply chain cyberattack, and occurs when a threat actor infiltrates and compromises software with malicious code either before the software is provided to consumers or by deploying malicious software updates masquerading as legitimate patches. All users of the compromised software are affected by this type of attack. For example, Blackbaud, Inc., a software company providing cloud hosting services to for-profit and non-profit entities across multiple industries, was ground zero for a software supply chain cyberattack after a threat actor deployed ransomware in its systems, with downstream effects on Blackbaud’s customers, including 45,000 companies. Similarly, in May 2023, Progress Software’s MOVEit file-transfer tool was targeted with a ransomware attack, which allowed threat actors to steal data from customers that used the MOVEit app, including government agencies and businesses worldwide.
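One practical defense against the software-enabled attacks described above is auditing the versions of third-party dependencies in use. Below is a minimal sketch of flagging Log4j versions in the publicly reported vulnerable range; the boundary versions are assumptions based on the cumulative Log4Shell-related fixes through release 2.17.1, and pre-release suffixes (e.g., "2.0-beta9") are not handled:

```python
# Assumed boundaries: Log4j 2.x releases before 2.17.1 may carry one of the
# Log4Shell-related CVEs; 1.x is out of scope for this particular check.
VULNERABLE_MIN = (2, 0, 0)
FIXED = (2, 17, 1)


def parse_version(version):
    """Parse a dotted numeric version string like '2.14.1' into a tuple."""
    return tuple(int(part) for part in version.split("."))


def is_vulnerable(version):
    """Rough check: is this Log4j version inside the assumed vulnerable range?"""
    v = parse_version(version)
    return VULNERABLE_MIN <= v < FIXED


# Example: audit the versions reported by a dependency inventory.
print(is_vulnerable("2.14.1"))  # True  -> upgrade needed
print(is_vulnerable("2.17.1"))  # False -> at or past the fix release
```

In practice, organizations would feed this kind of check with output from a software composition analysis tool or a software bill of materials rather than hand-entered version strings.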

Legal and Regulatory Risks

Cyberattacks can often expose personal data to unauthorized access and acquisition by a threat actor. When this occurs, companies’ notification obligations under the data breach laws of jurisdictions in which affected individuals reside are triggered. In general, data breach laws require affected companies to submit notice of the incident to affected individuals and, depending on the facts of the incident and the number of such individuals, also to regulators, the media, and consumer reporting agencies. Companies may also have an obligation to notify their customers, vendors, and other business partners based on their contracts with these parties. These reporting requirements increase the likelihood of follow-up inquiries, and in some cases, investigations by regulators. Reporting a data breach also increases a company’s risk of being targeted with private lawsuits, including class actions and lawsuits initiated by business customers, in which plaintiffs may seek different types of relief including injunctive relief, monetary damages, and civil penalties.

The legal and regulatory risks in the aftermath of a cyberattack can persist long after a company has addressed the immediate issues that caused the incident. For example, in the aftermath of its cyberattack, Blackbaud was investigated by multiple government authorities and targeted with private lawsuits. While the private suits remain ongoing, Blackbaud settled with state regulators ($49,500,000), the U.S. Federal Trade Commission, and the U.S. Securities and Exchange Commission (SEC) ($3,000,000) in 2023 and 2024, almost four years after it first experienced the cyberattack. Other companies that experienced high-profile cyberattacks have also been targeted with securities class action lawsuits by shareholders, and in at least one instance, regulators named a company’s Chief Information Security Officer in an enforcement action, underscoring the professional risks cyberattacks pose to corporate security leaders.

What Steps Can Companies Take to Mitigate Risk?

First, threat actors will continue to refine their tactics and techniques, so all organizations must adapt and stay current with the regulations and legislation surrounding cybersecurity. The Cybersecurity and Infrastructure Security Agency (CISA) urges developer education on creating secure code and verifying third-party components.

Second, stay proactive. Organizations must re-examine not only their own security practices but also those of their vendors and third-party suppliers. If third and fourth parties have access to an organization’s data, it is imperative to ensure that those parties have good data protection practices.

Third, companies should adopt guidelines for suppliers around data and cybersecurity at the outset of a relationship, since it may be difficult to get suppliers to adhere to policies after the contract has been signed. For example, some entities have detailed processes requiring suppliers to report attacks and conduct impact assessments after the fact. In addition, some entities expect suppliers to follow specific sequences of steps after a cyberattack. Some entities may also apply the same threat intelligence that they use for their own defense to their critical suppliers, and may require suppliers to implement proactive security controls, such as incident response plans, ahead of an attack.

Finally, all companies should strive to minimize threats to their software supply chain by establishing strong security strategies at the ground level.