What’s in the American Data Privacy and Protection Act?

Congress is considering omnibus privacy legislation, and it reportedly has bipartisan support. If passed, this would be a massive shake-up for American consumer privacy, which has been left to the states up to this point. So, how does the American Data Privacy and Protection Act (ADPPA) stack up against existing privacy legislation such as the California Consumer Privacy Act and the Virginia Consumer Data Protection Act?

The ADPPA includes a much broader definition of sensitive data than we’ve seen in state-level laws. Some notable inclusions are income level, voicemails and text messages, calendar information, data relating to a known child under the age of 17, and depictions of an individual’s “undergarment-clad” private area. These enumerated categories go much further than recent state laws, which tend to focus on health and demographic information. One asterisk though – unlike other state laws, the ADPPA only considers sexual orientation information to be sensitive when it is “inconsistent with the individual’s reasonable expectation” of disclosure. It’s unclear at this point, for example, if a member of the LGBTQ+ community who is out to friends would have a “reasonable expectation” not to be outed to their employer.

Like the European Union’s General Data Protection Regulation, the ADPPA includes a duty of data minimization on covered entities (the ADPPA borrows the term “covered entity” from HIPAA). There is a laundry list of exceptions to this rule, including one for using data collected prior to passage “to conduct internal research.” Companies used to kitchen-sink analytics practices may appreciate this savings clause as they adjust to making do with less access to consumer data.

Another innovation is a tiered applicability, under which all commercial entities are “covered entities,” but “large data holders” – those with over $250,000,000 in gross revenue that process either the data of 5,000,000 individuals or the sensitive data of 200,000 individuals – are subject to additional requirements and limitations, while “small businesses” enjoy additional exemptions. Until now, state consumer privacy laws have made applicability an all-or-nothing proposition. All covered entities, though, would be required to comply with browser opt-out signals, following a trend started by the California Privacy Protection Agency’s recent draft regulations. Additionally, individuals would have a private right of action against covered entities to seek monetary and injunctive relief.
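For a sense of what complying with a browser opt-out signal can look like in practice, below is a minimal sketch of how a web application might detect the Global Privacy Control (GPC) signal, which supporting browsers send as a Sec-GPC: 1 request header. The Flask route, cookie name, and opt_out_user helper are hypothetical illustrations, not requirements drawn from the ADPPA or the CPPA’s draft regulations.

```python
# Minimal, hypothetical sketch of honoring a browser opt-out signal.
# The GPC specification defines the "Sec-GPC: 1" request header; the route,
# cookie name, and opt_out_user() helper here are illustrative assumptions.
from flask import Flask, request

app = Flask(__name__)

def opt_out_user(visitor_id: str) -> None:
    # Hypothetical persistence hook: flag this visitor so their data is not
    # sold, shared, or used for targeted advertising.
    print(f"Recorded opt-out for visitor {visitor_id}")

@app.route("/page")
def page() -> str:
    # Treat the presence of Sec-GPC: 1 as a universal opt-out request.
    if request.headers.get("Sec-GPC") == "1":
        visitor_id = request.cookies.get("session_id", "anonymous")
        opt_out_user(visitor_id)
    return "page content"
```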

Finally, and controversially, the ADPPA explicitly preempts all state privacy laws. It makes sense – the globalized nature of the internet means that any less-stringent state law would become the exception that kills the rule. Still, companies that only recently finalized CCPA- and CPRA-compliance programs won’t appreciate being sent back to the drawing board.

Read the bill for yourself here.

Copyright © 2022 Robinson & Cole LLP. All rights reserved.

Biden Administration Seeks to Clarify Patient Privacy Protections Post-Dobbs, Though Questions Remain

On July 8, two weeks following the Supreme Court’s ruling in Dobbs v. Jackson Women’s Health Organization that invalidated the constitutional right to abortion, President Biden signed Executive Order 14076 (E.O.). The E.O. directed federal agencies to take various actions to protect access to reproductive health care services,[1] including directing the Secretary of the U.S. Department of Health and Human Services (HHS) to “consider actions” to strengthen the protection of sensitive healthcare information, including data on reproductive healthcare services like abortion, by issuing new guidance under the Health Insurance Portability and Accountability Act of 1996 (HIPAA).[2]

The directive bolstered efforts already underway by the Biden Administration. A week before the E.O. was signed, HHS Secretary Xavier Becerra directed the HHS Office for Civil Rights (OCR) to take steps to ensure privacy protections for patients who receive, and providers who furnish, reproductive health care services, including abortions.[3] The following day, OCR issued two guidance documents to carry out this order, which are described below.

Although the guidance issued by OCR clarifies the privacy protections as they exist under current law post-Dobbs, it does not offer patients or providers new or strengthened privacy rights. Indeed, the guidance illustrates the limitations of HIPAA regarding protection of health information of individuals related to abortion services.

A.  HHS Actions to Safeguard PHI Post-Dobbs

Following Secretary Becerra’s press announcement, OCR issued two new guidance documents outlining (1) when the HIPAA Privacy Rule may prevent the unconsented disclosure of reproductive health-related information; and (2) best practices for consumers to protect sensitive health information collected by personal cell phones, tablets, and apps.

(1) HIPAA Privacy Rule and Disclosures of Information Relating to Reproductive Health Care

In the “Guidance to Protect Patient Privacy in Wake of Supreme Court Decision on Roe,”[4] OCR addresses three existing exceptions in the HIPAA Privacy Rule to the disclosure of PHI without an individual’s authorization and provides examples of how those exceptions may be applied post-Dobbs.

The three exceptions discussed in the OCR guidance are the exceptions for disclosures required by law,[5]  for purposes of law enforcement,[6] or to avert a serious threat to health or safety.[7]

While the OCR guidance reiterates that the Privacy Rule permits, “but does not require,” disclosure of PHI under each of these exceptions,[8] this offers limited protection because it relies on the provider’s choice whether or not to disclose the information. Although these exceptions are highlighted as “protections,” they expressly permit the disclosure of protected health information. Further, while it is true that the HIPAA Privacy Rule itself may not compel disclosure (but merely permits it), the guidance fails to mention that in many situations in which these exceptions apply, the provider will have other legal authority (such as state law) mandating the disclosure, and thus a refusal to disclose the PHI may be unlawful based on a law other than HIPAA.

Two of the exceptions discussed in the guidance – the required-by-law exception and the law enforcement exception – apply in the first place only when valid legal authority requires disclosure. In those situations, the fact that HIPAA does not compel disclosure is of no relevance. Certainly, when there is no valid legal authority requiring disclosure of PHI, HIPAA prohibits disclosure, as noted in the OCR guidance. However, in states with restrictive abortion laws, the state legal authorities are likely to be designed to require disclosure – which HIPAA does not prevent.

For instance, if a health care provider receives a valid subpoena from a Texas court that is ordering the disclosure of PHI as part of a case against an individual suspected of aiding and abetting an abortion, in violation of Texas’ S.B. 8, then that provider could be held in contempt of court for failing to comply with the subpoena, despite the fact that HIPAA does not compel disclosure.[9] For more examples on when a covered entity may be required to disclose PHI, please see EBG’s prior blog: The Pendulum Swings Both Ways: State Responses to Protect Reproductive Health Data, Post-Roe.[10]

Notably, the OCR guidance does provide a new interpretation of the application of the exception for disclosures to avert a serious threat to health or safety. Under this exception, covered entities may disclose PHI, consistent with applicable law and standards of ethical conduct, if the covered entity, in good faith, believes the use or disclosure is necessary to prevent or lessen a serious and imminent threat to the health or safety of a person or the public. OCR states that it would be inconsistent with professional standards of ethical conduct to make such a disclosure of PHI to law enforcement or others regarding an individual’s interest, intent, or prior experience with reproductive health care. Thus, in the guidance, OCR takes the position that if a patient in a state where abortion is prohibited informs a health care provider of the patient’s intent to seek an abortion that would be legal in another state, this would not fall into the exception for disclosures to avert a serious threat to health or safety.  Covered entities should be aware of OCR’s position and understand that presumably OCR would view any such disclosure as a HIPAA violation.

(2) Protecting the Privacy and Security of Individuals’ Health Information When Using Personal Cell Phones or Tablets

OCR also issued guidance on how individuals can best protect their PHI on their own personal devices. HIPAA does not generally protect the privacy or security of health information when it is accessed through or stored on personal cell phones or tablets. Rather, HIPAA only applies when PHI is created, received, maintained, or transmitted by covered entities and business associates. As a result, it is not unlawful under HIPAA for information collected by devices or apps – including data pertaining to reproductive healthcare – to be disclosed without the consumer’s knowledge.[11]

In an effort to clarify HIPAA’s limitations in protecting such information, OCR issued guidance on protecting sensitive consumer information stored on personal devices and apps.[12] This includes step-by-step guidance on how consumers can control the collection of their location data and how to securely dispose of old devices.[13]

Further, some states have taken steps to fill these legal gaps, with varying degrees of success. For example, California’s Confidentiality of Medical Information Act (“CMIA”) extends to “any business that offers software or hardware to consumers, including a mobile application or other related device that is designed to maintain medical information.”[14] As applied, a direct-to-consumer period tracker app provided by a technology company, for example, would fall under the CMIA’s data privacy protections, but not under HIPAA. Nevertheless, gaps remain, as the CMIA does not protect against a Texas prosecutor subpoenaing information from the direct-to-consumer app. Conversely, Connecticut’s new reproductive health privacy law[15] does prevent a Connecticut covered entity from disclosing reproductive health information based on a subpoena, but Connecticut’s law does not apply to non-covered entities, such as a period tracker app. Therefore, even the most protective state privacy laws in the U.S. do not fill in all of the privacy gaps.

Alongside OCR’s guidance, the Federal Trade Commission (FTC) published a blog post warning companies with access to confidential consumer information to consider FTC’s enforcement powers under Section 5 of the FTC Act, as well as the Safeguards Rule, the Health Breach Notification Rule, and the Children’s Online Privacy Protection Rule.[16] Consistent with OCR’s guidance, the FTC’s blog post reiterates the Biden Administration’s goal of protecting reproductive health data post-Dobbs, but does not go so far as to create new privacy protections relative to current law.

B.  Despite the Biden Administration’s Guidance, Questions Remain Regarding the Future of Reproductive Health Privacy Protections Post-Dobbs

Through E.O. 14076, Secretary Becerra’s press conference, OCR’s guidance, and the FTC’s blog, the Biden Administration is signaling that it intends to use the full force of its authorities – including those vested by HIPAA – to protect patient privacy in the wake of Dobbs.

However, it remains unclear how this messaging will translate to affirmative executive actions, and how successful such executive actions would be. How far is the executive branch willing to push reproductive rights? Would more aggressive executive actions be upheld by a Supreme Court that just struck down decades of precedent permitting access to abortion? Will the Biden Administration’s executive actions persist if the administration changes in the next Presidential election?

Attorneys at Epstein Becker & Green are well-positioned to assist covered entities, business associates, and other companies holding sensitive reproductive health data in understanding how to navigate HIPAA’s exemptions and interactions with emerging guidance, regulations, and statutes at both the state and federal levels.

Ada Peters, a 2022 Summer Associate (not admitted to the practice of law) in the firm’s Washington, DC office and Jack Ferdman, a 2022 Summer Associate (not admitted to the practice of law) in the firm’s Boston office, contributed to the preparation of this post. 



[1] 87 Fed. Reg. 42053 (Jul. 8, 2022), https://bit.ly/3b4N4rp.

[2] Id.

[3] HHS, Remarks by Secretary Xavier Becerra at the Press Conference in Response to President Biden’s Directive following Overturning of Roe v. Wade (June 28, 2022), https://bit.ly/3zzGYsf.

[4] HHS, Guidance to Protect Patient Privacy in Wake of Supreme Court Decision on Roe (June 29, 2022),  https://bit.ly/3PE2rWK.

[5] 45 CFR 164.512(a)(1).

[6] 45 CFR 164.512(f)(1).

[7] 45 CFR 164.512(j).

[8] Id.

[9] See Texas S.B. 8; e.g., Fed. R. Civ. Pro. R.37 (outlining available sanctions associated with the failure to make disclosures or to cooperate in discovery in Federal courts), https://bit.ly/3BjX4I2.

[10] EBG Health Law Advisor, The Pendulum Swings Both Ways: State Responses to Protect Reproductive Health Data, Post-Roe (June 17, 2022), https://bit.ly/3oPDegl.

[11] A 2019 Kaiser Family Foundation survey concluded that almost one third of female respondents used a smartphone app to monitor their menstrual cycles and other reproductive health data. Kaiser Family Foundation, Health Apps and Information Survey (Sept. 2019), https://bit.ly/3PC9Gyt.

[12] HHS, Protecting the Privacy and Security of Your Health Information When Using Your Personal Cell Phone or Tablet (last visited Jul. 26, 2022), https://bit.ly/3S2MNWs.

[13] Id.

[14] Cal. Civ. Code § 56.10, Effective Jan. 1, 2022, https://bit.ly/3J5iDxM.

[15] 2022 Conn. Legis. Serv. P.A. 22-19 § 2 (S.B. 5414), Effective July 1, 2022, https://bit.ly/3zwn95c.

[16] FTC, Location, Health, and Other Sensitive Information: FTC Committed To Fully Enforcing the Law Against Illegal Use and Sharing of Highly Sensitive Data (July 11, 2022), https://bit.ly/3BjrzNV.

©2022 Epstein Becker & Green, P.C. All rights reserved.

Judge Approves $92 Million TikTok Settlement

On July 28, 2022, a federal judge approved TikTok’s $92 million class action settlement of various privacy claims made under state and federal law. The agreement will resolve litigation that began in 2019 and involved claims that TikTok, owned by the Chinese company ByteDance, violated the Illinois Biometric Information Privacy Act (“BIPA”) and the federal Video Privacy Protection Act (“VPPA”) by improperly harvesting users’ personal data. U.S. District Court Judge John Lee of the Northern District of Illinois also awarded approximately $29 million in fees to class counsel.

The class action claimants alleged that TikTok violated BIPA by collecting users’ faceprints without their consent and violated the VPPA by disclosing personally identifiable information about the videos people watched. The settlement agreement also provides for several forms of injunctive relief, including:

  • Refraining from collecting and storing biometric information, collecting geolocation data and collecting information from users’ clipboards, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Not transmitting or storing U.S. user data outside of the U.S., unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • No longer pre-uploading U.S. user generated content, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Deleting all pre-uploaded user generated content from users who did not save or post the content; and
  • Training all employees and contractors on compliance with data privacy laws and company procedures.
Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

What Employers Need to Know in a Post-Dobbs Landscape

On June 24, 2022, in Dobbs v. Jackson Women’s Health Organization, the United States Supreme Court overturned both Roe v. Wade and Planned Parenthood v. Casey and held that access to abortion is not a right protected by the United States Constitution. This article analyzes several employment law issues employers may face following the Dobbs decision.

Federal Law

The Pregnancy Discrimination Act (PDA) prohibits employment discrimination “on the basis of pregnancy, childbirth, or related medical conditions.” In construing the PDA’s reference to “childbirth”, federal courts around the country have held the PDA prevents employers from taking adverse employment actions (including firing, demotion, or preventing the opportunity for advancement) because of an employee’s decision to have an abortion as well as an employee’s contemplation of an abortion. The PDA also prohibits adverse employment actions based upon an employee’s decision not to have an abortion. So, for example, an employer would violate the PDA if it pressured an employee to have, or not to have, an abortion in order to keep her job or be considered for a promotion.

State Law

Several states have implemented “trigger laws,” which impose restrictions or categorical bans on abortion following Dobbs. In addition, states such as Texas have enacted laws that allow individuals to file civil actions against entities that “knowingly engage in conduct that aids or abets the performance or inducement of an abortion, including paying for or reimbursing the cost of an abortion through insurance or otherwise.” Relying on that law, Texas legislators have already threatened at least two high profile employers for implementing policies which reimburse travel costs for abortion care unavailable in an employee’s home state. Although the Texas statute is currently being challenged in court, its text provides for statutory damages “in an amount of not less than $10,000” for “each abortion . . . induced.”

Although the issue has not been litigated yet, courts will likely have to decide how the PDA’s protections interact with a state’s anti-abortion laws.

Employer Handbook Policies and Procedures

The Dobbs decision may also impact workplace morale and productivity. Accordingly, employers should consider reviewing their handbooks as well as policies and procedures, with human resources and managers to ensure requisite familiarity with the employer’s social media policy, dress code, code of conduct, and how the employer handles confidential health information. Employers should be prepared for increased public expression from the workforce—including social media posts, discussions with other employees and third parties, and wearing clothing or other accessories reflecting strong opinions. Human resources should also be prepared for an increase in leave requests and employee resignations.

Travel Benefits for Employees Seeking Reproductive Care

In the wake of Dobbs, many businesses in states where access to abortion will be prohibited or highly restricted are considering—or have already implemented—benefit or employee expense plan amendments that would cover travel and lodging for out-of-state abortions. Ultimately, the legal and regulatory future for such plans remains unclear, especially in states where abortion laws are the most restrictive and contain “aiding and abetting” liability.

At a high level, employers seeking to enact such benefit or expense plans may find some comfort in a statement contained in Justice Kavanaugh’s concurrence in Dobbs. Specifically, Justice Kavanaugh wrote:

  • Some of the other abortion related legal questions raised by today’s decision are not especially difficult as a constitutional matter. For example, may a State bar a resident of that State from traveling to another State to obtain an abortion? In my view, the answer is no based on the constitutional right to interstate travel.

Thus, it appears that outright travel bans or similar prohibitive restrictions would face significant legal challenges, and could be declared void.

At this early stage in the post-Roe era, there appear to be several ‘paths’ emerging for employers seeking to provide travel benefits. Each comes with its own set of potential issues and considerations that employers, in conjunction with their counsel and benefit providers, should evaluate carefully. Below is a brief discussion of some of the travel-reimbursement plans employers have begun to implement or consider in the wake of Dobbs:

  1. Travel and lodging benefits under existing group health plans.
    • Assuming the plans are self-funded and subject to ERISA, they must also comply with other applicable rules such as HIPAA and the ACA.
    • Such benefits may not be available under non-ERISA plans in states restricting abortion access.
    • Generally would be limited to individuals enrolled in the employer’s plan.
  2. Travel and lodging benefits under Health Reimbursement Arrangements (HRA’s).
    • An HRA is an employer-funded health reimbursement arrangement offering tax-free reimbursement of qualified medical expenses up to a fixed amount each year.
    • HRA’s are generally subject to ERISA and cannot reimburse above the very minimal IRS limits under Section 213, such as mileage ($0.18 per mile) and lodging ($50 per day).
    • Should be integrated with other coverage or qualify as an “Excepted Benefit HRA,” or else it may violate certain ACA rules that prohibit lifetime and annual dollar limits for certain benefits.
  3. Employee Assistance Programs (EAP’s).
    • EAP’s are voluntary benefit programs some employers use to allow employees access to certain types of care without accruing co-pays, deductibles, or out-of-pocket costs. Historically, EAP’s have been used predominantly for mental health benefits such as therapy or substance abuse counseling.
    • In certain circumstances, EAP’s are exempt from the ACA. To be an “excepted benefit,” the EAP:
      • Cannot provide significant benefits in the nature of medical care or treatment;
      • Cannot be coordinated with benefits under another group health plan;
      • Cannot charge a premium for participation; and
      • Cannot require cost sharing for offered services.
    • The first of the above requirements (significant benefits of a medical nature) is highly subjective and may create risk for employers because it is difficult to determine whether a benefit is “significant.” Accordingly, it may be difficult to locate a third-party vendor or provider that would administer travel and lodging benefits through an EAP.
  4. Travel and lodging benefits to employees as taxable reimbursements.
    • Taxable reimbursements—up to a certain amount annually—for travel to obtain abortion or other medical care not available in the employee’s place of residence.
    • Some employers are requiring only receipts for lodging, but are not requesting substantiation of the employee’s abortion procedure. Some argue this might insulate an employer from liability in states with statutes prohibiting “aiding or abetting” an abortion, on the grounds that the employer does not know what the employee is using the benefit for. Ultimately, whether that is true remains largely untested and unclear.
    • Likely more costly for the employer, because the benefit is broader in scope. In addition, employers may run the risk that a payroll reimbursement of this kind could qualify as setting up a “new medical plan,” thereby raising compliance and other related issues.

Additionally, employer travel-and-lodging benefits of this type present innumerable other questions and issues. Such questions should include:

  1. Is the employer’s benefit plan subject to ERISA?
    • ERISA is the federal law applicable to qualifying employee benefits plans, including employer-sponsored group health plans. Plans subject to ERISA must also comply with HIPAA, the ACA, and other applicable rules and regulations. So-called self-funded employer plans are subject to ERISA.
    • With some exceptions, ERISA preempts or blocks the implementation of state laws that “relate to” the ERISA plan.
    • However, ERISA does not:
      • Preempt a state law that regulates insurance companies operating in the state; or
      • Preempt state criminal laws of general applicability.
    • If a plan is self-insured and subject to ERISA, it may not be required to comply with state laws related to abortion services based on ERISA preemption.
    • However, the impact of new and untested civil and/or criminal penalties remains unclear.
  2. What procedures does the plan cover?
    • In this environment—especially in states with the most restrictive abortion laws—employers should have a firm understanding of what specific type of abortion procedures the plan covers.
  3. Specific or “general” travel stipends?
    • As noted above, some companies are choosing to provide travel/lodging stipends and benefits to access abortion care in jurisdictions where the procedure is lawful.
    • Some employers are making this travel stipend more general—i.e., not requiring that the stipend be used for abortion, or otherwise naming abortion in the benefit program. For example, a policy might provide a stipend for an employee to “travel to receive medical care that is unavailable within 100 miles of the employee’s place of residence.”
    • Note that out-of-plan reimbursements to employees are likely taxable as wages. Some employers may choose to gross up such stipends to compensate.
  4. What about privacy concerns?
    • Employers should think carefully about how to provide any benefits or stipends while protecting employee privacy, not violating HIPAA, and—where applicable—not running afoul of so-called ‘aiding and abetting’ legislation.
    • To that end, as noted above, some companies are requiring only that employees provide travel receipts—not documentation of the underlying procedure—to qualify for the benefit, reimbursement, or stipend.
    • Of course, without any verification, there is always the potential for abuse—or otherwise using the program for something well beyond its core intent, such as travel, elective plastic surgery, etc. However, some employers may evaluate the risk of abuse as worth the potential lessening of privacy and other concerns.

Protected Activity

Employers must also be aware that certain speech in the workplace—including speech about abortion—may be legally protected. Although the First Amendment generally does not extend to private companies, the National Labor Relations Act (NLRA) prohibits retaliation against employees who discuss the terms and conditions of employment, commonly referred to as “protected concerted activity.” Thus, employees who (1) discuss or advocate for an employer to provide benefits to women seeking reproductive and abortion-related healthcare services, (2) advocate for the employer to take a certain public stance on the issue, or (3) protest the employer’s public position on the issue may be engaging in protected activity under the NLRA.

Contacts and Next Steps

Employment law issues will continue to arise and evolve in the coming months following the Dobbs decision. The EEOC, DOL, and HHS may provide further guidance on how Dobbs impacts employment laws such as the Family and Medical Leave Act (FMLA), Americans with Disabilities Act (ADA), and PDA. Employers should consult with legal counsel concerning these developments.

Copyright © 2022, Sheppard Mullin Richter & Hampton LLP.

New Survey Shows that Americans are Ready for More Deliveries by Drone

Auterion, a drone software company, commissioned the market research firm Propeller Insights to survey 1,022 adults. The survey, conducted in May 2022, was gender-balanced and distributed across age groups from 18 to 65+, with respondents living in rural, suburban, and city environments in the United States.

In the report summarizing the survey, “Consumer Attitudes on Drone Delivery,” Auterion reveals that 58 percent of Americans like the idea of drone deliveries, and 64 percent think drones are becoming an option for home delivery now or will be in the near future. With more than 80 percent of those surveyed reporting that they have packages delivered to their homes on a regular basis, the survey finds that Americans are generally ready to integrate drone delivery into daily life for ease and speed. Of the 64 percent who see drones becoming a more common option for home delivery, 32 percent think it’s possible now or within the next 1 to 2 years.

Only 36 percent of those surveyed had doubts about this type of drone integration, including some individuals who think the general public or governments will not approve of large-scale drone adoption for delivery and others who just prefer that drone delivery doesn’t happen at all.

With individuals able to choose more than one option, the survey found that the most common types of home package deliveries reported by consumers today, made by cars and trucks, are:

  • 39 percent – groceries

  • 34 percent – clothing

  • 33 percent – household items

  • 31 percent – meals

  • 27 percent – medicine

  • 11 percent – baby food/needs

Based on these findings, those surveyed were also asked if they were willing to consider drones as a “new corner store” for conveniently delivering small and last-minute necessities: 54 percent of the individuals said “yes.”

With regard to concerns about these drone deliveries, 43 percent of those surveyed fear the drone will break down and they will not receive their item, and 19 percent are worried about not having human interaction with a delivery person. However, drone delivery systems provide accurate tracking and direct delivery, and therefore are more capable of accurate delivery timing. Delivery drones are built to analyze the environment with precision, to communicate through control software in a common language, and to predict safe landing spots for packages. Air space is becoming a great option at a time when highways are filled with cars and trucks and fuel prices are rising. Drones can help reduce our reliance on gas-powered delivery vehicles and provide safer, more flexible, and more cost-effective delivery.

Copyright © 2022 Robinson & Cole LLP. All rights reserved.

A Rule 37 Refresher – As Applied to a Ransomware Attack

Federal Rule of Civil Procedure 37(e) (“Rule 37”) was completely rewritten in the 2015 amendments.  Before the 2015 amendments, the standard was that a party could not generally be sanctioned for data loss as a result of the routine, good faith operation of its system. That rule didn’t really capture the reality of all of the potential scenarios related to data issues nor did it provide the requisite guidance to attorneys and parties.

The new rule added a dimension of reasonableness to preservation and a roadmap for analysis.  The first guidepost is whether the information should have been preserved. This rule is based upon the common law duty to preserve when litigation is likely. The next guidepost is whether the data loss resulted from a failure to take reasonable steps to preserve. The final guidepost is whether or not the lost data can be restored or replaced through additional discovery.  If there is data that should have been preserved, that was lost because of failure to preserve, and that can’t be replicated, then the court has two additional decisions to make: (1) was there prejudice to another party from the loss OR (2) was there an intent to deprive another party of the information.  If the former, the court may only impose measures “no greater than necessary” to cure the prejudice.  If the latter, the court may take a variety of extreme measures, including dismissal of the action. An important distinction was created in the rule between negligence and intention.
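Because the amended rule reads like a decision procedure, it can help to see the guideposts laid out schematically. The sketch below is an illustrative rendering of that framework in code, assuming yes/no findings a court would make at each step; the function name and inputs are illustrative framing, not language from Rule 37(e), and the sketch is no substitute for the rule’s text.

```python
# Illustrative sketch (not legal advice) of the Rule 37(e) framework described above.
def rule_37e_outcome(should_have_been_preserved: bool,
                     reasonable_steps_taken: bool,
                     restorable_or_replaceable: bool,
                     prejudice_to_other_party: bool,
                     intent_to_deprive: bool) -> str:
    # Threshold guideposts: a duty to preserve, a failure to take reasonable
    # steps, and an inability to restore or replace the lost information.
    if not should_have_been_preserved or reasonable_steps_taken or restorable_or_replaceable:
        return "No Rule 37(e) sanctions"
    # Intent to deprive opens the door to the most severe measures.
    if intent_to_deprive:
        return "Severe measures available (e.g., adverse inference, dismissal)"
    # Prejudice alone limits the court to measures no greater than necessary to cure it.
    if prejudice_to_other_party:
        return "Measures no greater than necessary to cure the prejudice"
    return "No remedy warranted"

print(rule_37e_outcome(True, False, False, True, False))
```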

So how does a ransomware attack fit into the new analytical framework? A Special Master in MasterObjects, Inc. v. Amazon.com (U.S. Dist. Court, Northern District of California, March 13, 2022) analyzed Rule 37 in the context of a ransomware attack. MasterObjects was the victim of a well-documented ransomware attack, which precluded the company’s access to data prior to 2016. The Special Master considered the declaration from MasterObjects, which explained that, despite using state-of-the-art cybersecurity protections, the firm was attacked by hackers in December 2020. The hack rendered all the files/mailboxes inaccessible without a recovery key set by the attackers. The hackers demanded a ransom, and the company contacted the FBI. Both the FBI and the company’s insurer advised it not to pay the ransom. Despite spending hundreds of hours attempting to restore the data, everything prior to 2016 remained inaccessible.

Applying Rule 37, the Special Master stated that, at the outset, there is no evidence that any electronically stored information was “lost.”  The data still exists and, while access has been blocked, it can be accessed in the future if a key is provided or a technological work-around is discovered.

Even if a denial of access is construed to be a “loss,” the Special Master found no evidence in this record that the loss occurred because MasterObjects failed to take reasonable steps to preserve it. This step of the analysis, “failure to take reasonable steps to preserve,” is a “critical, basic element” to prove spoliation.

On the issue of prejudice, Amazon argued that “we can’t know what we don’t know” (related to missing documents).  The Special Master did not find Amazon’s argument persuasive. The Special Master concluded that Amazon’s argument cannot survive the adoption of Rule 37(e). “The rule requires affirmative proof of prejudice in the specific destruction at issue.”

Takeaways:

  1. If you are in a spoliation dispute, make sure you have the experts and evidence to prove or defend your case.

  2. When you are trying to prove spoliation, know the new test and apply it in your analysis (the Special Master noted that Amazon did not reference Rule 37 in its briefing).

  3. As a business owner, when it comes to cybersecurity, you must take reasonable and defensible efforts to protect your data.

©2022 Strassburger McKenna Gutnick & Gefsky

Italian Garante Bans Google Analytics

On June 23, 2022, Italy’s data protection authority (the “Garante”) determined that a website’s use of the audience measurement tool Google Analytics is not compliant with the EU General Data Protection Regulation (“GDPR”), as the tool transfers personal data to the United States, which does not offer an adequate level of data protection. In making this determination, the Garante joins other EU data protection authorities, including the French and Austrian regulators, that also have found use of the tool to be unlawful.

The Garante determined that websites using Google Analytics collected, via cookies, personal data including user interactions with the website, pages visited, browser information, operating system, screen resolution, selected language, date and time of page views, and user device IP address. This information was transferred to the United States without the additional safeguards for personal data required under the GDPR following the Schrems II decision, and therefore faced the possibility of governmental access. In the Garante’s ruling, website operator Caffeina Media S.r.l. was ordered to bring its processing into compliance with the GDPR within 90 days, but the ruling has wider implications, as the Garante commented that it had received many “alerts and queries” relating to Google Analytics. It also stated that it called upon “all controllers to verify that the use of cookies and other tracking tools on their websites is compliant with data protection law; this applies in particular to Google Analytics and similar services.”

Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

Throwing Out the Privacy Policy is a Bad Idea

The public internet has been around for about thirty years and consumers’ browser-based graphic-heavy experience has existed for about twenty-five years. In the early days, commercial websites operated without privacy policies.

Eventually, people started to realize that they were leaving trails of information online, and in the early ’aughts the methods by which businesses capture and profit from these trails became clear, although the actual uses of the data on individual sites were not. People asked for greater transparency from the sites they visited online, and in response they received the privacy policy.

A deeply flawed instrument, the website privacy policy purports to explain how information is gathered and used by a website owner, but most such policies are strangely both imprecise and too long, losing the average reader in a fog of legalese and marginally relevant facts. Some privacy policies are intentionally obtuse because it doesn’t profit the website operator to make its methods obvious. Many are overly general, in part because the website company doesn’t want to change its policy every time it shifts business practices or vendor alliances. Many are just messy and poorly written.

Part of the reason that privacy policies are confusing is that data privacy is not a precise concept. The definition of data is context dependent. Data can mean information about a transaction, information gathered from your browser visit (including where you were before and after the visit), information about you or your equipment, or even information derived by analysis of the other information. And we know that de-identified data can be re-identified in many cases, and that even a collection of generic data can lead to one of many ways to identify a person.

The definition of privacy is also untidy. An ecommerce company must capture certain information to fulfill an online order. In this era of connected objects, the company may continue to take information from the item while the consumer is using it. This is true for equipment from televisions to dishwashers to sex toys. The company likely uses this information internally to develop its products. It may use the data to market more goods or services to the consumer. It may transfer the information to other companies so they can market their products more effectively. The company may provide the information to the government. This week’s New Yorker devotes several pages to how the word “privacy” conflates major concepts in US law, including secrecy and autonomy,1 and is thus confusing to courts and the public alike.

All of this is difficult to reflect in a privacy policy, even if the company has incentive to provide useful information to its customers.

Last month the Washington Post ran an article by Geoffrey Fowler that was subtitled “Let’s abolish reading privacy policies.” The article notes a 2019 Pew survey claiming that only 9 percent of Americans say they always read privacy policies. I would suggest that more than half of those Americans are lying. Almost no one always reads privacy policies upon first entering a website or downloading an app. That’s not even really what privacy policies are for.

Fowler shows why people do not read these policies. He writes, “As an experiment, I tallied up all of the privacy policies just for the apps on my phone. It totaled nearly 1 million words. “War and Peace” is about half as long. And that’s just my phone. Back in 2008, Lorrie Cranor, a professor of engineering and public policy at Carnegie Mellon University, and a colleague estimated that reading and consenting to all the privacy policies on websites Americans visit would take 244 hours per year.”

The length, complexity, and opacity of online privacy policies are concerning. The best way to address this concern is not to eliminate privacy policies, but to make them less instrumental in the most important decisions about descriptive data.

Website owners should not be expected to write privacy policies that are both sufficiently detailed and succinctly readable for consumers to make meaningful choices about use of the data that describes them. This type of system forces a person to be responsible for her own data protection and takes the onus off of the company to limit its use of the data. It is like our current system of waste recycling – both ineffective and supported by polluters, because rather than forcing manufacturers to use more environmentally friendly packaging, it pushes consumers to deal with the problem at home, shifting the burden from industry to us. Similarly, if legislatures provided a set of simple rules for website operators – here is what you are allowed to do with personal data, and here is what you are not allowed to do with it – then no one would need to read privacy policies to make sure data about our transactions was spared the worst treatment. The worst treatment would be illegal.

State laws are moving in this direction, providing simpler rules restricting certain uses and transfers of personal data and sensitive data. We are early in the process, but if the trend continues regarding omnibus state privacy laws in the same manner that all states eventually passed data breach disclosure laws, then we can be optimistic and expect full coverage of online privacy rules for all Americans within a decade or so. But we shouldn’t need to wait for all states to comply.

Unlike the data breach disclosure laws, which encourage companies to comply only with the laws relevant to their particular loss of data, omnibus privacy laws affect the way companies conduct the normal course of everyday business, so it will only take requirements in a few states before big companies start building their privacy rights recognition functions around the most protective standard. It will simply make economic sense for businesses to give every US customer the same rights as the most protective state provides its residents. Why build 50 sets of rules when you don’t need to do so? The cost savings of maintaining only one privacy rights-recognition system will offset the cost of providing privacy rights to people in states that haven’t passed omnibus laws yet.

This won’t make privacy policies any easier to read, but it will become less important to read them. Then privacy policies can return to their core function, providing a record of how a company treats data. In other words, a reference document, rather than a set of choices inset into a pillow of legal terms.

We shouldn’t eliminate the privacy policy. We should reduce the importance of such policies and limit their functions, reducing customer frustration with the privacy policy’s role in our current process. Limit companies’ use of data and we won’t need to fight through their privacy options.


ENDNOTES

1 Privacy law also conflates these meanings with obscurity in a crowd or in public.


Article By Theodore F. Claypoole of Womble Bond Dickinson (US) LLP

Copyright © 2022 Womble Bond Dickinson (US) LLP All Rights Reserved.

The DOJ Throws Cold Water on the Frosties NFT Founders

The U.S. Attorney’s Office for the Southern District of New York recently charged two individuals for allegedly participating in a scheme to defraud purchasers of “Frosties” non-fungible tokens (or “NFTs”) out of over $1 million. The two-count complaint charges Ethan Nguyen (aka “Frostie”) and Andre Llacuna (aka “heyandre”) with conspiracy to commit wire fraud in violation of 18 U.S.C. § 1349 and conspiracy to commit money laundering in violation of 18 U.S.C. § 1956.   Each charge carries a maximum sentence of 20 years in prison.

The Defendants marketed “Frosties” as the entry point to a broader online community consisting of games, reward programs, and other benefits.  In January 2022, their “Frosties” pre-sale raised approximately $1.1 million.

In a so-called “rug pull,” Frostie and heyandre transferred the funds raised through the pre-sale to a series of separate cryptocurrency wallets, eliminated Frosties’ online presence, and took down its website.  The transaction, which was publicly recorded and viewable on the blockchain, triggered investors to sell Frosties at a considerable discount.  Frostie and heyandre then allegedly proceeded to move the funds through a series of transactions intended to obfuscate the source and increase anonymity.  The charges came as the Defendants were preparing for the March 26 pre-sale of their next NFT project, “Embers,” which law enforcement alleges would likely have followed the same course as “Frosties.”

In a public statement announcing the arrests, the DOJ explained how the emerging NFT market is a risk-laden environment that has attracted the attention of scam artists.  Representatives from each of the federal agencies that participated in the investigation cautioned the public and put other potential fraudsters on notice of the government’s watchful eye towards cryptocurrency malfeasance.

This investigation comes on the heels of the FBI’s announcement last month of the Virtual Asset Exploitation Unit, a special task force dedicated to blockchain analysis and virtual asset seizure. The prosecution of the Defendants in this matter continues aggressive efforts by federal agencies to rein in bad actors participating in the cryptocurrency/digital assets/blockchain space.

Copyright ©2022 Nelson Mullins Riley & Scarborough LLP

WW International to Pay $1.5 Million Civil Penalty for Alleged COPPA Violations

In 2014, with childhood obesity on the rise in the United States, tech company Kurbo, Ltd. (Kurbo) marketed a free app for kids that, according to the company, was “designed to help kids and teens ages 8-17 reach a healthier weight.” When WW International (WW) (formerly Weight Watchers) acquired Kurbo in 2018, the app was rebranded “Kurbo by WW,” and WW continued to market the app to children as young as eight. But according to the Federal Trade Commission (FTC), Kurbo’s privacy practices were not exactly child-friendly, even if its app was. The FTC’s complaint, filed by the Department of Justice (DOJ) last month, claims that WW’s notice, data collection, and data retention practices violated the Children’s Online Privacy Protection Act Rule (COPPA Rule). WW and Kurbo, under a stipulated order, agreed to pay a $1.5 million civil penalty in addition to complying with a range of injunctive provisions. These provisions include, but are not limited to, deleting all personal information of children whose parents did not provide verifiable parental consent in a specified timeframe, and deleting “Affected Work Product” (defined in the order to include any models or algorithms developed in whole or in part using children’s personal information collected through the Kurbo Program).

Complaint Background

The COPPA Rule applies to any operator of a commercial website or online service directed to children that collects, uses, and/or discloses personal information from children and to any operator of a commercial website or online service that has actual knowledge that it collects, uses, and/or discloses personal information from children. Operators must notify parents and obtain their consent before collecting, using, or disclosing personal information from children under 13.

The complaint states that children enrolled in the Kurbo app by signing up through the app or having a parent do it on their behalf. Once on Kurbo, users could enter personal information such as height, weight, and age, and the app then tracked their weight, food consumption, and exercise. However, the FTC alleges that Kurbo’s age gate was porous, requiring no verification process to establish that children who affirmed they were over 13 were the age they claimed to be or that users asserting they were parents were indeed parents. In fact, the complaint alleges that the registration area featured a “tip-off” screen that gave visitors just two choices for registration: the “I’m a parent” option or the “I’m at least 13” option. Visitors saw the legend, “Per U.S. law, a child under 13 must sign up through a parent” on the registration page featuring these choices. In fact, thousands of users who indicated that they were at least 13 were younger and were able to change their information and falsify their real age. Users who lied about their age or who falsely claimed to be parents were able to continue to use the app. In 2020, after a warning from the FTC, Kurbo implemented a registration screen that removed the legend and the “at least 13” option. However, the new process failed to provide verification measures to establish that users claiming to be parents were indeed parents.
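By way of contrast with the registration flow the FTC criticized, the sketch below illustrates one commonly recommended approach to a neutral age gate: ask for a date of birth without disclosing the cutoff, compute the user’s age, and route under-13 visitors into a verifiable parental consent flow rather than letting them re-enter an older age. The function names, cutoff constant, and routing labels are hypothetical and are not drawn from the complaint or the order.

```python
# Hypothetical sketch of a neutral age gate (illustrative only).
# It asks for a birth date without disclosing the age cutoff and routes
# under-13 visitors to a verifiable-parental-consent flow instead of
# letting them simply re-enter an older age.
from datetime import date

UNDER_13_CUTOFF = 13

def age_on(birth_date: date, today: date) -> int:
    # Whole years elapsed between birth_date and today.
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def route_registration(birth_date: date, today: date) -> str:
    if age_on(birth_date, today) < UNDER_13_CUTOFF:
        # The session should also be flagged so the visitor cannot go back
        # and enter a different birth date.
        return "parental_consent_flow"
    return "standard_registration"

print(route_registration(date(2012, 6, 1), date(2022, 3, 3)))  # parental_consent_flow
```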

Kurbo’s notice of data collection and data retention practices also fell short. The COPPA Rule requires an operator to “post a prominent and clearly labeled link to an online notice of its information practices with regard to children on the home or landing page or screen of its Web site or online service, and, at each area of the Web site or online service where personal information is collected from children.” But beginning in November 2019, Kurbo’s notice at registration was buried in a list of hyperlinks that parents were not required to click through, and the notice failed to list all the categories of information the app collected from children. Further, Kurbo did not comply with the COPPA Rule’s mandate to keep children’s personal information only as long as reasonably necessary for the purpose it was collected and then to delete it. Instead, the company held on to personal information indefinitely unless parents specifically requested its removal.
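To make the retention requirement concrete, here is a hypothetical sketch of the kind of scheduled deletion routine the COPPA Rule contemplates, using a one-year retention window consistent with the timeframe in the stipulated order discussed below. The record layout, field names, and function are assumptions for illustration only, not a description of Kurbo’s or WW’s systems.

```python
# Hypothetical retention sweep (illustrative only): purge a child's record
# once a year has passed since the user's last activity, per a written
# retention schedule.
from datetime import datetime, timedelta

RETENTION_PERIOD = timedelta(days=365)

def purge_expired_child_records(records: list[dict], now: datetime) -> list[dict]:
    """Return only the records still within the retention window.

    Each record is assumed to look like:
    {"user_id": "u1", "is_child": True, "last_activity": datetime(2021, 1, 5)}
    """
    kept = []
    for record in records:
        expired = record["is_child"] and (now - record["last_activity"]) > RETENTION_PERIOD
        if expired:
            # A real implementation would also delete derived work product
            # (e.g., models trained on the data), as the stipulated order requires.
            print(f"Deleting expired record for {record['user_id']}")
        else:
            kept.append(record)
    return kept
```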

Stipulated Order

In addition to imposing a $1.5 million civil penalty, the order, which was approved by the court on March 3, 2022, requires WW and Kurbo to:

  • Refrain from disclosing, using, or benefitting from children’s personal information collected in violation of the COPPA Rule;
  • Delete all personal information Kurbo collected in violation of the COPPA Rule within 30 days;
  • Provide a written statement to the FTC that details Kurbo’s process for providing notice and seeking verifiable parental consent;
  • Destroy all affected work product derived from improperly collecting children’s personal information and confirm to the FTC that deletion has been carried out;
  • Delete all children’s personal information collected within one year of the user’s last activity on the app; and
  • Create and follow a retention schedule that states the purpose for which children’s personal information is collected, the specific business need for retaining such information, and criteria for deletion, including a set timeframe no longer than one year.

Implications of the Order

Following the U.S. Supreme Court’s decision in AMG Capital Management, LLC v. Federal Trade Commission, which halted the FTC’s ability to use its Section 13(b) authority to seek equitable monetary relief for violations of the FTC Act, the FTC has been pushing Congress to grant it greater enforcement powers. In the meantime, the FTC has used other enforcement tools, including the recent resurrection of the agency’s long-dormant Penalty Offense Authority under Section 5(m)(1)(B) of the FTC Act and a renewed willingness to use algorithmic disgorgement (which the FTC first applied in the 2019 Cambridge Analytica case).

Algorithmic disgorgement involves “requir[ing] violators to disgorge not only the ill-gotten data, but also the benefits—here, the algorithms—generated from that data,” as then-Acting FTC Chair Rebecca Kelly Slaughter stated in a speech last year. This order appears to be the first time algorithmic disgorgement was applied by the Commission in an enforcement action under COPPA.

Children’s privacy issues continue to attract the attention of the FTC and lawmakers at both federal and state levels. Companies that collect children’s personal information should be careful to ensure that their privacy policies and practices fully conform to the COPPA Rule.

© 2022 Keller and Heckman LLP