TIK TOK TIK TOK: Time Running Out For Preliminary Court Approval of Multimillion Dollar TikTok Privacy Settlement

On February 25, 2021, Plaintiffs’ Motion for preliminary approval of a $92 million settlement was filed in the ongoing multidistrict litigation, In Re: Tiktok, Inc., Consumer Privacy Litigation (Case: 1:20-cv-04699).  Shortly after the motion was filed, objections were lodged regarding the basis and terms of the settlement.  After a hearing on March 3, 2021, the Court requested supplemental briefing from the parties explaining, among other concerns: how the parties arrived at the final $92 million figure; how they addressed differences between adult users and minor users for purposes of the settlement; and why class members could not be notified of the deal through the TikTok app itself.

So, what’s the scoop?  Let’s backtrack first.

Recall that on August 8, 2020, the Panel on Multidistrict Litigation consolidated 10 pending class action cases against TikTok Inc. (five from the Northern District of California, four from the Northern District of Illinois, and one from the Southern District of Illinois) as In Re: Tiktok, Inc., Consumer Privacy Litigation (Case: 1:20-cv-04699).  The multidistrict litigation has now expanded to 21 putative class actions against Defendants TikTok Inc., ByteDance Technology Inc., and foreign Defendants TikTok, Ltd. (previously known and sued as Musical.ly), and Beijing ByteDance Technology Co. Ltd.  All actions concern Defendants’ collection, use, and transmission of highly sensitive personal data via Defendants’ ubiquitous TikTok app.  Plaintiffs in the consolidated actions allege that Defendants’ conduct with respect to the scanning, capture, retention, and dissemination of the facial geometry and other biometric information of the app’s users violates multiple privacy statutes.  Plaintiffs also allege that Defendants use private information to track and profile TikTok users, among other things, for ad targeting and profit.  In other words: this case is obviously a big deal in the world of data privacy litigation.

The operative Consolidated Amended Class Action Complaint, filed on December 18, 2020, alleges ten separate causes of action against Defendants, including violation of the Illinois Biometric Information Privacy Act (“BIPA”), the Computer Fraud and Abuse Act (“CFAA”), the California Comprehensive Computer Data Access and Fraud Act (“CDAFA”), the right to privacy under the California Constitution, and the California Unfair Competition Law (“UCL”), among others.

In response, TikTok has argued that it does not and never has collected from its users any biometric identifiers or derivative information protected by law, nor has it ever shared U.S. user data with foreign government third-parties.  In support of its position that it has not violated BIPA, TikTok has asserted that its user video data is not used to identify anyone, as app users cannot tag or label faces in videos with a user’s real name or identity.

The recently filed objections include that the settlement does not account for serious conflicts among minor class members, nationwide class members, and Illinois subclass members, arising out of: (a) the minors’ ability to disaffirm any arbitration agreement or class action waiver, and (b) TikTok’s statutory obligation to delete data collected about them. The objections also contend that the proposed settlement value is unfair, unreasonable, and inadequate, and falls far below the net expected value of continued litigation.  Objectors also attack the notice plan as violating both Rule 23 and Due Process. Objecting Plaintiffs allege that the notice plan improperly relies on publication notice when direct notice should be given via the TikTok app itself.  They also claim the notice plan fails to set forth the estimated claim value and appears designed to suppress claims and objection rates.

How will this all shake out?  Will the court ultimately be satisfied that the settlement is fair and otherwise meets the various requirements necessary for preliminary approval?  Supplemental briefs from the parties are due by March 23, and a status and motion hearing is set for April 6.

© Copyright 2020 Squire Patton Boggs (US) LLP


For more articles on Tik Tok, visit the NLR Litigation / Trial Practice section.

Singapore Academy of Law Considers the Impact of Robotics and Artificial Intelligence on the Law

The Law Reform Committee (LRC) of the Singapore Academy of Law (SAL) established a Subcommittee on Robotics and Artificial Intelligence to consider and make recommendations regarding the application of the law to AI systems. The LRC is considering whether existing systems of law, regulation, and wider public policy remain “fit for purpose,” given the rapid and ceaseless pace of change in the AI field. The LRC published two reports in July 2020, one report in September 2020, and one report in February 2021:

  • “Applying Ethical Principles for Artificial Intelligence in Regulatory Reform”;
  • “Rethinking Database Rights and Data Ownership in an AI World”;
  • “Report on the Attribution of Civil Liability for Accidents Involving Autonomous Cars”; and
  • “Report on Criminal Liability, Robotics and AI Systems.”

This initiative is part of the report series on the “impact of robotics and artificial intelligence on the law,” which aims to stimulate systematic thought and debate on these issues among policy makers, legislators, industry, the legal profession, and the public, so that legislation can be adopted in line with the evolution of AI. The two most recent reports in the series address the application of criminal law to the operation of AI systems and technologies, and the attribution of civil liability for accidents involving autonomous cars.

This article examines each report and highlights issues currently under consideration that may impact industries in Singapore whose business models, operations, or products may rely on AI systems and/or robotics.

REPORT 1: APPLYING ETHICAL PRINCIPLES FOR ARTIFICIAL INTELLIGENCE IN REGULATORY REFORM

Report 1 by the Subcommittee identifies issues that law and policy makers may face in applying ethical principles when developing or reforming policies and laws regarding AI. The primary objective of this report is to advance a public discussion about how those ethical principles can be incorporated into the development of “fair, just, appropriate and consistent laws, regulations and ‘soft law’ measures that foster technological development that prioritises human wellbeing and promotes human dignity and autonomy.” Specifically, the report discusses the following ethical principles that should be relevant for legal reform for AI:

Law and Fundamental Interests

AI systems should be designed and deployed to comply with the law and not to violate the established fundamental interests of persons that the law protects. The two main issues with regard to liability of AI systems are (i) the lack of a mental state of the relevant actor, such as knowledge or intention, attributable to a person, and (ii) the fact that a “decision” by an AI system to act is the result of a long causation chain involving different actors at different stages of the system’s creation and deployment.

Considering AI Systems’ Effects

Designers and deployers of AI systems should consider the reasonably foreseeable effects of AI systems throughout their lifecycle. It is possible that existing principles are sufficient and could be relied upon to fairly apportion liability. However, policy makers may require more tailor-made interventions by creating principles specific to certain scenarios.

Wellbeing and Safety

AI systems should be rational, fair, and without intentional or unintentional biases. It is necessary to assess AI systems’ intended and unintended effects against holistic wellbeing and safety metrics and minimize harm by considering factors such as human emotions, empathy, and personal privacy.

Risk Management to Human Wellbeing

It is imperative for designers and deployers of AI systems to properly assess and eliminate or control risks of the use of AI systems as a matter of safety and wellbeing. Policymakers will need to consider whether mandatory risk management standards need to be imposed, and if so, the form in which, and specificity with which, such standards are articulated.

Respect for Values and Culture

AI systems should be designed to take into account, as far as reasonably possible, the societal values and cultural diversity of the different societies in which they are deployed. Taking societal values and cultural norms into account is especially important to making AI systems effective.

Transparency

AI systems should be designed to be transparent as far as reasonably possible, so that it can be discovered how and why an AI system made a particular decision or acted the way it did. Transparency entails being able to trace, explain, verify, and interpret all aspects of AI systems and their outcomes insofar as possible. The objective is not only to properly regulate AI but also to build trustworthy AI. One possible regulatory response to the challenges of tracing, explaining, and verifying different aspects of AI is to require mechanisms to be built into AI systems that, as far as reasonably possible, record input data and the logic behind decisions taken by the AI, much like an aircraft’s black-box recorder.
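
To make the report’s “black-box recorder” idea more concrete, the sketch below shows one way such a mechanism might look in practice. It is a purely illustrative Python example, not drawn from the report; the class name (DecisionRecorder), the log file name, the JSON-lines format, and the toy scoring rule are all assumptions.

    # Illustrative sketch only; the report does not prescribe any implementation.
    # The class name, log file, log format, and toy scoring rule are assumptions.
    import hashlib
    import json
    import time

    class DecisionRecorder:
        """Append-only log of an AI system's inputs, outputs, and stated rationale."""

        def __init__(self, path="decisions.log"):
            self.path = path

        def record(self, model_version, inputs, output, rationale):
            entry = {
                "timestamp": time.time(),
                "model_version": model_version,
                # Hash the inputs so the log stays compact but remains verifiable.
                "input_digest": hashlib.sha256(
                    json.dumps(inputs, sort_keys=True).encode()
                ).hexdigest(),
                "output": output,
                "rationale": rationale,  # e.g., the rule or top features behind the decision
            }
            with open(self.path, "a") as f:
                f.write(json.dumps(entry) + "\n")
            return entry

    # Example: wrap a hypothetical scoring function with the recorder.
    recorder = DecisionRecorder()

    def score_applicant(features):
        decision = "approve" if features.get("income", 0) > 50_000 else "refer"
        recorder.record(
            model_version="credit-rules-v1",
            inputs=features,
            output=decision,
            rationale={"rule": "income_threshold", "threshold": 50_000},
        )
        return decision

    score_applicant({"income": 62_000, "years_employed": 4})

If regulators pursued this idea, the design questions would likely center on what such a log must capture and how long it must be retained, rather than on any particular technology.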

Accountability

Appropriate persons should be held accountable for the proper functioning of AI systems, based on their roles, the context, and consistency with the state of the art.

Ethical Data Use

Good privacy and personal data management practices should be adopted to protect the personal data of individuals.

REPORT 2: RETHINKING DATABASE RIGHTS AND DATA OWNERSHIP IN AN AI WORLD

Report 2 by the Subcommittee identifies key data-related and intellectual property laws on databases and data ownership, especially those that relate to “big data” databases used for AI systems. Any deficiencies in laws on data or databases may have ripple effects on laws managing AI systems.

Databases

Existing Legal Protections

The Subcommittee analyses whether the protection of databases under copyright and patent law is adequate. Current protection in Singapore is limited to elements that meet the requisite level of originality (i.e., the application of intellectual effort, creativity, or the exercise of mental labor, skill, or judgment). In contrast, big data compilations do not have a single author; rather, they consist of automated data collected into raw machine-generated databases. This focus on the creative element excludes valuable databases from protection.

Recommendations

Introduction of a sui generis database right[1] is not appropriate under Singapore law given the limited evidence of its effectiveness. The Subcommittee recommends that (i) copyright protection of computer-generated works be recognized and (ii) greater clarity be provided as to how compilation rights apply to the copyright protection of databases and how records of database authorship can be properly maintained.

Data Ownership

Current Status

The report reviews whether data collected by AI, whether as individual data or a combination of data elements, should be granted property rights. Personal data is protected in Singapore under the Personal Data Protection Act (PDPA), but even though the data subject enjoys certain protections, he or she is not granted legal ownership of the data. Given the nature of data, there are fundamental difficulties, on grounds of jurisprudential principle and policy, with using ownership and property rights as legal frameworks to control data.

Merits of Granting Property Rights Over Personal Data

There are various arguments for granting property rights over data, such as providing a clear and coherent method to protect privacy and relying on existing property laws to provide established protection. Currently, data is protected through a mix of copyright, confidentiality, and privacy laws.

Recommendations

The report concludes that creating a property right in data is not desirable due to the conceptual challenges posed by data’s intangibility. Particular rights or entitlements over personal data can be introduced by means other than ownership (e.g., the data portability obligation under the PDPA). Specific data control methods can be implemented to protect individual rights as well as to support data innovation.

REPORT 3: ATTRIBUTION OF CIVIL LIABILITY FOR ACCIDENTS INVOLVING AUTONOMOUS CARS

Although it is hoped that autonomous vehicles will significantly reduce the number of accidents on public roads, regulators are considering how civil liability should be attributed when accidents or collisions involving autonomous cars do occur and cause injury or death.

At present (i.e., for car accidents involving human drivers), Singapore law applies a fault-based negligence framework: the person most responsible for the accident is held liable (that liability then typically being covered by motor insurance).

For self-driving cars, many events leading up to an accident may stem from decisions made by the car’s autonomous features, with no human input or intervention whatsoever. As the car cannot meaningfully be held accountable and sued directly, the question becomes whether to attribute liability to the car’s manufacturer, the manufacturer of the components that did not function properly, or the car’s owner or user.

Authorities in various overseas jurisdictions have taken recent steps to review and reform aspects of their laws to accommodate the arrival on public roads of, in the first instance, conditionally autonomous cars—where a human driver is still required to take back control if necessary.

To date, the approach in Singapore has been to introduce “sandbox” regulations to promote innovation in autonomous car technologies in Singapore rather than seeking to legislate now for future mainstream use. However, different liability frameworks presently used in other areas of law in Singapore (i.e., negligence, product liability, and no-fault liability) have yet to be applied to autonomous vehicles.

Negligence

Typically, negligence-based laws require the establishment of (a) a duty of care (foreseeability of harm), (b) a breach of that duty (standard of care), and (c) recoverable damage. However, failures of software present a challenge and render the question of breach much more complicated to resolve.

Product Liability

This regime focuses on dangerous product defects and manufacturers’ failure to adopt reasonable product designs that mitigate foreseeable risks of harm; it is well developed in Europe but less well developed than negligence in Singapore law. The committee considers that, in Singapore, strict liability is likely to have an adverse impact on the availability and cost of insurance and risks stifling innovation. In addition, moving from a negligence-based regime to a novel strict-liability regime may involve significant transition costs for Singapore, even if the new regime were limited to self-driving car accidents.

No-Fault Liability

No-fault liability simply requires that if the harm was suffered due to the accident, compensation for the victim follows as a matter of course. The relative simplicity of a no-fault liability regime makes it initially attractive as a means to address the conceptual problems that self-driving cars create. However, the current law’s requirements to prove certain legal and evidential matters should not be disregarded, and completely abandoning them would change existing legal paradigms.

According to the committee, given Singapore’s long-established negligence-based liability regime and the potential transition costs entailed in adopting a wholly new model, the more productive approach may therefore be to retain the existing system but make targeted modifications to import the desirable features of product liability and no-fault liability, where appropriate. Given this, and the fact that no other jurisdiction has yet identified a comprehensive and convincing liability framework for motor accidents involving autonomous vehicles (regardless of their level of automation), a sui generis regime may be required for Singapore.

REPORT 4: CRIMINAL LIABILITY, ROBOTICS AND AI SYSTEMS

Attribution of criminal liability to a person generally requires both a wrongful act (or, in certain cases, omission) and a mental element on the part of the person carrying out the act. That fault element, also known as “mens rea,” may involve intention, wilfulness, knowledge, rashness, or negligence.

Autonomous robotic and artificial intelligence (RAI) systems are increasingly being deployed, which can raise challenges in attributing criminal liability and holding someone responsible where harm is caused. However, while criminal liability can be imposed on natural or legal persons—and thus on both humans and corporate entities—an RAI system is not a legal person on which criminal responsibility could be placed directly.

Therefore, questions arise as to (a) which aspect of the RAI system factually caused it to act the way it did (resulting in harm), (b) which party (or parties)—be that the system manufacturer, the system owner, a component manufacturer, or a software developer—was responsible for that aspect, and (c) whether that party could have foreseen or mitigated the harm.

For RAI systems, it is useful to distinguish between cases of intentional criminal use of (or interference with) the RAI system and those where nonintentional criminal harm is caused.

For Intentional Criminal Harm

Current legislation will be applicable and could be improved but may not drastically change.

For Nonintentional Harm

In Singapore, certain offences can be satisfied when a person is criminally negligent. However, even if some existing negligence-based offences in the Singapore Penal Code could be used for RAI systems, other types of harm might not fall within the existing framework. With more complex RAI systems, it may be very difficult (in some instances, practically impossible) to establish definitively the process by which the RAI system determined to take a particular action.

Therefore, the committee has considered other mechanisms that could be implemented in Singapore for RAI criminal liability:

Legal Personality of RAI Systems

One possibility that has been debated is the creation of a new form of legal personality for RAI systems, such that criminal liability could be imposed directly on the RAI system itself. However, it is unclear, for example, how imposing criminal liability and sanctions on an RAI system directly would “punish,” “deter,” or “rehabilitate” the system itself. And if the objective is instead to deter or penalize those responsible for the RAI system, that could arguably equally be achieved through legal mechanisms that do not require new forms of legal personality to be created.

New Offences for Computer Programs

The new offences could target the creation of risk by developers or operators of computer programs through rash or negligent creation, or could impose a duty on those with control over a computer program to take reasonable steps to stop harms resulting from the program once they manifest. This approach could allow courts to identify the persons criminally liable and the parameters of their duties. However, the contours of such offences remain uncertain, and such an approach could deter innovation.

Workplace Safety Legislation as a Model

This is a model where duties are imposed on specified entities to take, so far as is reasonably practicable, such measures as are necessary to avoid harm. There is a focus on whether the relevant entity breached its statutory duty to take all reasonably practicable measures to avoid the harm. Ultimately, whether and when it is justified to place such an onus on those responsible for RAI systems is a policy judgment for lawmakers, balancing demands for accountability with the desire not to unduly stifle innovation and impede the societally beneficial development and use of RAI systems.

NEXT STEPS

The reports of the SAL are intended to encourage systematic thought and debate between various policymaking and industry stakeholders such that public policy on AI remains close to the commercial use of AI. If you wish to get in touch with policy makers, please contact us.

[1] A sui generis database right is a right that exists in the European Union to recognize the investment made in compiling a database.

Copyright 2020 K & L Gates


For more, visit the NLR Global news section.

Five Tips for Enhancing Your Virtual Proceedings

In 2020, you learned to advocate for your clients outside of the courtroom through the use of online platforms. Conducting virtual trials and arbitrations has now become common practice. If you are looking for ways to improve the quality of your online proceedings, I have five pieces of advice to offer you.

1. Check Your Setup.

You don’t want any tech mishaps during your virtual proceeding. Take the time to prepare your setup so that everything goes off without a hitch. You will want an external monitor—or two—connected to your laptop. Use one screen to see all the participants and the other screen to show exhibits displayed on the screen share. A quality webcam is a must-have, ideally one with 1080p resolution. You will need clear audio too, and I recommend either headphones embedded with a microphone or a USB microphone attached to your computer. If you’re able to work from an office, turn a conference room into your “presentation room.” Set up a dedicated videoconference computer with a simple backdrop and have everyone examine witnesses there. Finally, a week or two before your proceedings begin, test out your equipment and software, and make sure all the features work for every party in the proceeding.

2. Get a Host.

If you want the smoothest virtual proceeding, you’ll need a host. Most courts have a host for online proceedings, but for arbitrations, the parties may need to find their own. All parties should agree on one neutral vendor to act as host. The host will then assign a technician to focus on connectivity, security, technical troubleshooting, and ensuring everything flows well. Having this agreement in place will alleviate virtually all of your potential technical problems.

3. Use a Hot Seat Operator.

As with an in-person hearing, a hot seat operator is vital when proceedings are virtual. The host and hot seat operator should not be the same person. This way, the hot seat operator can focus on pulling up exhibits, showing demonstratives, and running video clips, instead of on troubleshooting connectivity. You don’t need the hot seat operator with you physically in order to present evidence. Last year, I worked on a virtual arbitration; the host was in Florida, the attorneys were in Europe, and I was in the hot seat, presenting evidence from California—and it all worked perfectly.

4. Pick a Platform.

There are many video conference platforms available. Most courts are using Zoom, Webex, or Skype. However, if you are planning a virtual arbitration, you will likely have a choice. After using a number of platforms, my personal preference is Zoom. It is the industry standard, and for a good reason. It has a user-friendly interface and easy screen-sharing capabilities. Zoom is also highly secure. It has 256-bit TLS (Transport Layer Security) encryption, and all shared content can be encrypted. With these security measures, along with a meeting password and the waiting room enabled, you will have a secure meeting.

5. Focus on Details.

Practice makes perfect for a virtual proceeding. Examining witnesses over a virtual platform is an art form, so make sure you rehearse over a videoconference. Additionally, it’s critical to be prepared, especially when you are operating in a remote setting. Set up a messaging service such as Slack or Microsoft Teams, so your whole team can easily communicate in writing while your arbitration is in session. You don’t want messages popping up all the time when you are sharing your screen, so make sure you’ve turned off notifications or consider using a separate laptop for messaging with your team. Be prepared by making backups of all your important documents and have a backup laptop and a backup Wi-Fi provider, such as a cellular hot spot, in case the internet goes out while you’re in session.

Conducting your proceedings over a virtual platform can offer you so many benefits. First, it allows you to continue to serve the needs of your clients despite court closures. Second, it allows you to forgo travel expenses and spend more time at home. Who doesn’t want more time and money? Most importantly, when you put in place an expert team and follow the tips I’ve provided, virtual proceedings offer you a way to vigorously and effectively advocate for your client.

© Copyright 2002-2020 IMS ExpertServices, All Rights Reserved.


For more, visit the NLR Law Office Management section.

Ransomware Incident Compromises Unemployment Claim Information of 1.6M in WA

It is being reported that the Office of the Washington State Auditor (SAO) is investigating a security incident, allegedly caused by a third-party vendor, that may have compromised the personal information of up to 1.6 million residents of the state of Washington who filed unemployment claims in 2020.

The SAO is investigating fraudulent unemployment claims filed in Washington in 2020 that reportedly cost the state up to $600 million. In completing the audit, the state utilized a third-party vendor, Accellion, to transmit computer files for the investigation.

According to the SAO, “during the week of January 25, 2021, Accellion confirmed that an unauthorized person gained access to SAO files by exploiting a vulnerability in Accellion’s file transfer service.” The SAO posted on its website that the unauthorized person “was able to exploit a software vulnerability in Accellion’s file transfer service and gain access to files that were being transferred using Accellion’s service,” which occurred in December 2020.

Data that may have been affected includes 1.6 million individuals’ claims made between January 1, 2020, and December 10, 2020, including claims made by state employees. The compromised information includes individuals’ names, Social Security numbers and/or driver’s license or state ID numbers, bank information, and place of employment. In addition, the personal information of some individuals whose information was held by the Department of Children, Youth and Families was also compromised.

What a terrible consequence for those who legitimately lost their job and filed for unemployment benefits. For those whose personal information was used to file a fraudulent unemployment claim, this news throws a massive amount of salt in the wound of being the victim of identity theft.


Copyright © 2020 Robinson & Cole LLP. All rights reserved.
For more, visit the NLR Communications, Media & Internet section.

CCPA for Lawyers: Notice of Collection Needed for Third-Party Subpoenas & Discovery Requests?

CCPA Illogic: Do lawyers have to give notices of collection before sending out third party subpoenas?

A law firm may be considered a service provider under the CCPA to the extent that a written contract between the law firm and its client (e.g., an engagement letter) prohibits the law firm from using, retaining, and disclosing personal information except to the extent permitted by the client. As the CCPA only requires that a “business that collects a consumer’s personal information” provide a notice at collection,[1] if a law firm is a service provider it would not be required to provide a notice at collection to individuals from whom it is attempting to collect personal information.

If, on the other hand, a law firm is considered a business, it is possible that it is exempt from the requirement to provide a notice at collection. Specifically, businesses are exempt from any obligations under the CCPA to the extent that they “restrict a business’s ability to . . . exercise or defend legal claims.”[2] A court might determine that requiring a law firm to provide a notice at collection restricts the law firm’s ability to exercise or defend legal claims on behalf of clients, or restricts clients’ ability to have their claims exercised or defended by the law firm.

Even if a law firm is not exempt from the obligation to provide a notice at collection, assuming that the target of the subpoena is a California consumer, the subpoena itself may implicitly satisfy the obligation to provide a notice at collection. Specifically, a notice at collection should include the following information:

  • A list of the categories of personal information that will be collected;
  • The business or commercial purpose for which the information is being collected;
  • Information on how to opt-out of the sale of personal information (if information is being sold); and
  • Information on how to find the company’s complete privacy notice.[3]

A third party subpoena, by its nature, specifies the type of personal information that is being sought, and that the information will be used within the context of the identified litigation. While a subpoena does not specify how a recipient can opt out of the sale of their personal information, discovery and ethics rules prevent a law firm from attempting to sell personal information received in discovery. While most subpoenas do not specifically indicate how a subpoena recipient can find a copy of the law firm’s privacy notice, if a recipient is represented by counsel, it would be difficult to argue that their counsel would not know how to locate a law firm’s online privacy notice to the extent that one has been posted. The net result is that most, if not all, of the information required by a notice at collection may be contained within a subpoena.[4]

CCPA Illogic: Do lawyers have to give notices of collection before sending out discovery requests?

A law firm may be considered a service provider under the CCPA to the extent that a written contract between the law firm and its client (e.g., an engagement letter) prohibits the law firm from using, retaining, and disclosing personal information except to the extent permitted by the client. The CCPA only requires that a “business that collects a consumer’s personal information” provide a notice at collection.[5] As a result, if a law firm is a service provider, it would not be required to provide a notice at collection to individuals from whom it is attempting to collect personal information.

If, on the other hand, a law firm is considered a business, it is possible that it is exempt from the requirement to provide a notice at collection. Specifically, businesses are exempt from any obligations under the CCPA to the extent that they “restrict a business’s ability to . . . exercise or defend legal claims.”[6] A court might determine that requiring a law firm to provide a notice at collection restricts the law firm’s ability to exercise or defend legal claims on behalf of clients, or restricts clients’ ability to have their claims exercised or defended by the law firm.

Even if a law firm is not exempt from the obligation to provide a notice at collection, assuming that the opposing party is a California consumer, a discovery request may implicitly satisfy the obligation to provide a notice at collection. Specifically, a notice at collection should include the following information:

  • A list of the categories of personal information that will be collected;
  • The business or commercial purpose for which the information is being collected;
  • Information on how to opt-out of the sale of personal information (if information is being sold); and
  • Information on how to find the company’s complete privacy notice.[7]

A discovery request (e.g., interrogatories, document requests, or a deposition request) specifies the type of personal information that is being sought, and implicit in the discovery request is that the information will be used within the context of the litigation. While a discovery request does not specify how an opposing party can opt out of the sale of their personal information, discovery and ethics rules often prevent a law firm from attempting to sell personal information received in discovery.[8] While most discovery requests do not specifically indicate how an opposing party can find a law firm’s complete privacy notice, if an opposing party is represented by counsel, it would be difficult to argue that opposing counsel would not know how to locate a law firm’s privacy notice to the extent that it is publicly posted online. The net result is that most, if not all, of the information required by a notice at collection may be contained in a discovery request itself.[9]


[1] Cal. Civ. Code 1798.100(b) (Oct. 2020) (emphasis added).
[2] Cal. Civ. Code 1798.145(a)(5).
[3] CCPA Reg. 999.305(b)(1)-(4).
[4] Note that as of January 1, 2023, a notice at collection would also need to include the “length of time” that the business intends to retain each category of personal information. Cal. Civ. Code 1798.100(a)(3). In the context of civil litigation, the length of time that information will be kept is often conveyed to the opposing party through other means such as a negotiated protective order that discusses the return or destruction of documents at the end of the litigation.

[5] Cal. Civ. Code 1798.100(b) (Oct. 2020) (emphasis added).
[6] Cal. Civ. Code 1798.145(a)(5).
[7] CCPA Reg. 999.305(b)(1)-(4).
[8] For example, ABA Model Rule of Professional Conduct 4.4(a) prohibits a lawyer from using any method of obtaining evidence that would “violate the legal rights” of a third party.
[9] Note that as of January 1, 2023, a notice at collection would also need to include the “length of time” that the business intends to retain each category of personal information. Cal. Civ. Code 1798.100(a)(3). In the context of civil litigation, the length of time that information will be kept is often conveyed to the opposing party through other means such as a negotiated protective order that discusses the return or destruction of documents at the end of the litigation.


For more, visit the NLR Law Office Management section

Political Action Committee & Personal Political Contributions Become the Next Reputational Challenge for Law Firms & Their Clients

Aesop perhaps said it best: “You are known by the company you keep.” It appears many organizations are learning the true meaning of that phrase in the wake of the Republican vote against certification of the Electoral College results and the January 6 U.S. Capitol riots.

In a mere week’s time, corporate giants including Marriott International, Dow, JPMorgan, American Express, Nike, Google, Facebook and Microsoft have publicly declared they are pausing contributions from their political action committees (PACs). They are joined by a growing chorus that contains some of the world’s most well-known brands. While most of these organizations have targeted the members of Congress who voted against certification, many are making larger declarations, including Charles Schwab, which announced it is shutting down its PAC and donating the money to charity and to historically Black colleges and universities.

Since the first PAC was established in 1943 by the Congress of Industrial Organizations after Congress prohibited unions from donating directly to political candidates, PACs have been a strategic tool to help law firms, corporations, banks, unions, trade associations and others achieve strategic business objectives affected by the laws and regulations that govern – or hinder – their growth. Corporate PACs, at companies like those listed above, rely on voluntary contributions from employees – and that is likely one of the reasons the decisions announced this past week came so swiftly. It is challenging to keep employees motivated – or to keep them at all – if they suddenly find that their own values are diametrically opposed to those held by the organization they work for.

For an example of how employee values can shape corporate decision making, read this piece we wrote when household goods retailer Wayfair ran into an employee buzz saw after it was discovered the company was supplying bedroom furniture to a federal detention center in Texas. Note too, this story describing the pullback by law firms including Porter Wright and Jones Day after colleagues in the firms raised concerns about their work on the 2020 election challenges.

Aside from employee pushback, the values of other stakeholders that organizations prize no doubt factored into the decisions regarding PAC contributions as well. Those important audiences include customers and clients, investors, suppliers and even the communities in which these organizations operate. Here, social media’s power to harness and broadcast stakeholder outrage is an important factor for the PAC distribution committee to consider.

No doubt some of the PAC decisions also were colored by the fact that PAC contributions are now relatively easy to uncover. The Center for Responsive Politics, for instance, hosts a website that makes it easy to discover, by year, how much individual organizations have donated to which parties and to which House and Senate candidates or incumbents. Access to comparable information at the state level varies, but likely will move toward more transparency given recent events. All the above is true, as well, for individuals making political contributions, apart from their PAC contributions. A quick visit to www.fec.gov/data/ opens a page with a simple enter-a-name-here search box and within seconds, one can see campaign donations made by co-workers, friends, competitors, spouses, children, extended relatives and celebrities. Similar easy-to-search databases are available at the state level and most counties across the country.
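
As an illustration of just how programmatic this kind of lookup has become, the sketch below queries the FEC’s public OpenFEC API for individual contributions by contributor name. It is a hypothetical example; the endpoint, parameter, and field names reflect my reading of the public API documentation and should be verified at api.open.fec.gov before relying on them.

    # Hypothetical sketch; the endpoint, parameter, and field names are assumptions
    # based on the public OpenFEC API documentation and should be verified before use.
    import requests

    def lookup_contributions(name, api_key="DEMO_KEY"):
        resp = requests.get(
            "https://api.open.fec.gov/v1/schedules/schedule_a/",
            params={
                "api_key": api_key,        # free API keys are available from the FEC
                "contributor_name": name,  # assumed parameter name
                "per_page": 20,
            },
            timeout=30,
        )
        resp.raise_for_status()
        for item in resp.json().get("results", []):
            print(
                item.get("contributor_name"),
                item.get("contribution_receipt_amount"),
                (item.get("committee") or {}).get("name"),
            )

    lookup_contributions("SMITH, JANE")

Even a rough script like this underscores the point: federal contribution records are effectively one query away from anyone who cares to look.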

Combine this access to information with social media’s role as the global town crier and it’s naïve at best to assume no one will notice an individual or PAC’s significant contribution to a recipient of note – especially one with a highly controversial position on high profile issues or a questionable voting record.

While there are many reasons why an individual or organization might decide to support a specific lawmaker, those reasons may not be as readily apparent to stakeholders (including employees), the media or the public.  If yours is not one of the many organizations that have publicly announced that they are withdrawing some or all of their PAC support, now would be a good time to get ready to explain why you’ve supported the individuals you have, and what your path going forward may be. Here are some messages to consider:

  • How does this recipient’s voting record and position align with your organization’s mission and values? How have your contributions helped your organization grow and thrive so it can better serve its stakeholders?
  • If your organization has a strong commitment to corporate social responsibility, how do these contributions support that work?
  • If there are other reasons you support this individual, what are they?
  • If there are reasons why you no longer support this individual, what prompted you to end your support?

In a similar manner, if your organization took a public position in support of hot-button issues like Black Lives Matter and #MeToo, but your political contributions speak otherwise, how will you address that discrepancy (which is likely to be defined by others as hypocrisy)?

If your organization stands behind its record of political support, be prepared to defend that record with transparency and honesty.  And, be prepared to do so before media and social media seize the advantage they have in galvanizing opinion quickly. While your PAC – or the personal checks you’ve written – may be only one small portion of your organization’s government affairs program, these days, it’s the one everyone seems to be talking about.


The views and opinions expressed in this posting are those of the author and do not necessarily reflect the views or position of the National Law Review, the National Law Forum LLC, or any of its affiliates.

© 2020 Hennes Communications. All rights reserved.


Esports: What We Should Expect in 2021

The esports ecosystem experienced transcendental growth in 2020, due at least in part to the Covid-19 pandemic, and is poised to act as a springboard for even further growth this year. With traditional sports largely sidelined last year, stadiums closed to fans, and people starving for personal interaction, gamers and spectators alike have turned to esports in record numbers.  According to Newzoo, a prominent esports analytics company, 22% of the internet population participates in esports, and global gaming revenue is expected to hit $159 billion by the end of 2020.[1] Streamers and streaming platforms have exploded in popularity, allowing streamers to earn income from broadcasting their live gameplay, interacting with fans, and engaging with other players.

Building on the tremendous growth in 2020, here are some trends that some prominent members of the esports community are forecasting for 2021.

Significant Shift in Brand Advertising.

Enthusiast Gaming, a North American gaming platform, went entirely virtual in 2020 and sponsored a free four-day EGLX tournament in November 2020 that was watched by over 12 million people around the world.[2]  SpiderTech and G Fuel were among the key sponsors of the event, which featured musical performances by Zhu and Goldlink and virtual appearances by athletes Richard Sherman and Darius Slay.[3]  With the enormous success of events such as EGLX, and esports audiences continuing to skyrocket, Enthusiast Gaming forecasts that mainstream brands will significantly increase their advertising spend to sponsor esports tournaments and events as a necessary means to engage with that tough-to-reach Gen-Z audience.[4]

Convergence of Esports with Mainstream Sports.

Other industry experts have predicted that the world will also see greater convergence between traditional sports and esports, with professional football teams launching their own esports teams.[5]  For example, in December 2020, the Philadelphia Eagles named Esports Entertainment Group (“EEG”) as their official esports tournament provider.[6]  As part of a multi-year deal, EEG will operate bi-annual Madden esports tournaments for the Eagles.[7]  EEG will collaborate with Eagles players to create videos to promote the tournaments and will feature Eagles players in increased digital marketing efforts.  First-movers such as the Philadelphia Eagles are likely to spawn increasing connectedness between esports and traditional sports teams.

Growth of Esports in Popular Culture.

On April 24, 2020, more than 12 million people attended rapper Travis Scott’s virtual concert in Fortnite.[8]  Last year, FaZe Clan, one of the world’s most popular and successful professional gaming teams, entered the film industry and formed FaZe Studios, which plans to create feature films and a scripted television series.[9]  In June 2020, FaZe Clan also announced its co-ownership of CTRL, a food supplement company.[10]  Also in 2020, the NBA sponsored the first-ever players-only esports tournament, in which sixteen NBA stars competed in an NBA2K20 tournament on Xbox, won by Devin Booker, who earned $100,000 to donate to the charity of his choice.[11]  Based on the success of events such as these, 2021 is likely to see a surge in the integration of esports with popular culture, as the music, apparel, and film industries seek to integrate themselves into gaming communities through in-game interactions.[12]

Increased Fragmentation and Evolution.

Other industry experts such as Spiketrap have observed that more and more people are streaming a greater variety and volume of content than ever before.[13]  For example, according to one source, 42% of the U.S. population has live-streamed online content (compared with just 25% in 2017), and live-streaming is expected to be a $70.5 billion industry by 2021.[14]  Spiketrap predicts greater fragmentation in the esports industry created by the explosive growth in the source, variety and content of live-streaming.[15]  Musicians, athletes and other content creators will need to find a way to integrate and leverage live-streamed content with their own to better connect and expand their relationships with fans and spectators.

Increased Participation in the Ecosystem.

Still other industry experts have observed an unprecedented increase in player participation within the esports ecosystem.  Esports One, for example, has witnessed a rampant rise in virtual currency, rankings, badges, skins and image banners, as player-members seek to “flex” their muscles and show off their skill to their friends and fellow competitors.[16]  As esports matures and more games and titles are supported, Esports One predicts that there will be more opportunities for sponsorships, integration, and tournament prizes.[17]

In short, as tumultuous and dynamic as 2020 was socially, politically, and epidemiologically, 2021 promises to be an unprecedented year in the esports world.

FOOTNOTES

[1] https://secure.outsiderclub.com/299076?msclkid=3270be3ec8d81c46bfa7e1de34b4f7d0

[2] https://markets.businessinsider.com/news/stocks/enthusiast-gaming-s-online-gaming-festival-eglx-watched-by-over-12-million-fans-1029837940

[3] Id.

[4] https://www.thegamingeconomy.com/2020/12/07/predictions-2021-esports/

[5] https://www.thegamingeconomy.com/2020/12/07/predictions-2021-esports/

[6] https://www.bignewsnetwork.com/news/267267002/eagles-become-first-nfl-team-with-esports-tournament-provider

[7] https://www.bignewsnetwork.com/news/267267002/eagles-become-first-nfl-team-with-esports-tournament-provider

[8] https://www.cnn.com/2020/04/24/entertainment/travis-scott-fortnite-concert/index.html

[9] https://www.theverge.com/2020/4/28/21239034/faze-clan-studios-film-tv-projects

[10] https://twitter.com/fazeclan/status/1273692097384714240

[11] https://nba.2k.com/2k20/en-US/news/first-ever-nba-2k-players-tournament/

[12] https://www.thegamingeconomy.com/2020/12/07/predictions-2021-esports/

[13] https://www.thegamingeconomy.com/2020/12/07/predictions-2021-esports/

[14] https://livestream.com/blog/62-must-know-stats-live-video-streaming#:-:text=Nielson%27s%20U.S.%20Video%20360%20Report.rise%20from%2025%25%20in%202017

[15] https://www.thegamingeconomy.com/2020/12/07/predictions-2021-esports/

[16] https://www.thegamingeconomy.com/2020/12/07/predictions-2021-esports/

[17] Id.


Copyright © 2020, Sheppard Mullin Richter & Hampton LLP.
For more, visit the NLR Entertainment, Art & Sports section.

Twitter fined $546,000 in December 2020 by European Data Protection Authority for 2019 Breach Notification Violations

The Irish Data Protection Commission (DPC) fined Twitter 450,000 euros (about US$546,000) for failing to timely notify the Irish DPC within the required 72 hours of discovering a Q4 2018 breach involving a bug in its Android app, and also for failing to adequately document that breach.  The bug caused some 88,726 European Twitter users’ protected tweets to be made public.

The case is notable because it is the first fine levied against a U.S. technology company for a cross-border violation under the EU’s General Data Protection Regulation (GDPR), which went into effect in 2018.  Under the GDPR, the member state in which a foreign company has its EU headquarters takes the lead on inquiries on behalf of all of the EU’s 27 member states. Because Twitter’s EU headquarters are in Ireland, the DPC took the lead in investigating the 2018 breach incident, which Twitter attributed to poor staffing during the holidays.

Pursuant to Article 60 of the GDPR, the Irish DPC submitted its draft decision last May to the other EU DPAs. In the draft decision, the Irish DPC found Twitter’s violations to be negligent, but not intentional or systematic.  Other member states disagreed with the Irish DPC’s draft decision, due in part to the small proposed fine.  The Irish DPC’s proposed fine was only a small fraction of the maximum fine amount permitted, which under the GDPR is up to 4% of a company’s global revenue or 20 million euros ($22 million), whichever is higher. Twitter’s global annual revenue was reportedly about $3 billion in 2018.

The Irish DPC responded to the criticisms from other member states by stating that its proposed fine under the GDPR was an “effective, proportionate and dissuasive measure” and brought the matter before the European Data Protection Board, which upheld most of the decision but directed Ireland to increase the fine.

The Twitter case is just the first of many cases involving U.S. companies before the Irish DPC, as there are some 20 other pending inquiries. Ireland also serves as the EU headquarters for U.S. technology companies such as Facebook, Apple and Google.

The decision is available here.


Copyright © 2020 Robinson & Cole LLP. All rights reserved.
For more, visit the NLR Communications, Media & Internet section.

2020 In Review: An AI Roundup

There has been much scrutiny of artificial intelligence tools this year. From NIST to the FTC to the EU Parliament, many have recommendations and requirements for companies that want to use AI tools. Key concerns include being transparent about the use of the tools, ensuring accuracy, not discriminating against individuals when using AI technologies, and not using the technologies in situations where they may not give reliable results (i.e., for things for which they were not designed). Additional requirements for the use of these tools exist under the GDPR as well.

Legal counsel may feel uncomfortable with business teams who are moving forward in deploying AI tools. It’s not likely, however, that lawyers will be able to slow down the inevitable and widespread use of AI. We anticipate more developments in this area into 2021.

Putting It Into Practice: Companies can use “privacy by design” principles to help them get a handle on business teams’ AI efforts. Taking time to fully understand the ways in which the AI tool will be used (both immediately and in any future phases of a project) can be critical to ensuring that regulator concerns and legal requirements are addressed.


Copyright © 2020, Sheppard Mullin Richter & Hampton LLP.
For more, visit the NLR Communications, Media & Internet section.

The U.S. Department of Justice Releases its Cryptocurrency Enforcement Framework

Earlier this year, the U.S. Department of Justice (“DOJ”) released its highly anticipated Cryptocurrency Enforcement Framework (the “Framework”).  The Framework was developed as part of the Attorney General’s Cyber-Digital Task Force, and contains three sections:  (1) Threat Overview; (2) Law and Regulations; and (3) Ongoing Challenges and Future Strategies.

The “Threat Overview” section details various illicit uses of cryptocurrency and highlights how criminals increasingly have used cryptocurrency to fund illicit and illegal activities, including purchasing and selling illegal drugs and firearms, funding terrorist organizations, laundering money, and engaging in other illegal activities on the dark web.  The Framework also discusses how hackers have targeted cryptocurrency marketplaces for theft and fraud activities.

The “Law and Regulations” section of the Framework details the existing statutory and regulatory framework that DOJ and others have used and can use to regulate cryptocurrency.  As the Framework explains, DOJ is not the only enforcement actor in this space, and many other agencies – including, among others, the U.S. Treasury Department, the Securities & Exchange Commission, the Commodity Futures Trading Commission, and the Internal Revenue Service – have been actively enforcing violations by criminal cyber actors.  While the Framework is generally supportive of a broad, multi-pronged enforcement landscape, it highlights the difficulty of tracking and complying with an increasingly complex web of regulations created by these various agencies.

The third and final section of the Framework discusses current challenges and strategies for future enforcement.  This section notes the inherently decentralized and cross-border nature of cryptocurrency, and the problems it poses for enforcement.  Though the global nature of cryptocurrency might complicate investigations, the Framework makes clear it will not hinder DOJ’s willingness or ability to prosecute cases, stating, “The Department also has robust authority to prosecute VASPs [Virtual Asset Service Providers] and other entities and individuals that violate U.S. law even when they are not located inside the United States.  Where virtual asset transactions touch financial, data storage, or other computer systems within the United States, the Department generally has jurisdiction to prosecute the actors who direct or conduct those transactions.”  The enforcement section emphasizes the Bank Secrecy Act (BSA) and Anti Money Laundering (AML) laws as primary tools of enforcement, particularly for actors who deal with “anonymity enhanced cryptocurrencies” and technology that obscures the ownership of particular assets.  The report stresses that obligations to safeguard systems, protect consumer data, and properly maintain customer information apply not only to conventional virtual asset exchanges, but also to peer-to-peer exchanges, kiosk operators, and virtual currency casinos.

The DOJ released the Framework at a time when interest in cryptocurrency is at an all-time high.  Bitcoin passed $20,000 recently, and the record-setting level is a clear indication of increased interest in the major digital asset.  Cryptocurrencies continue to attract an increasing number of investors, including well-known companies and fund managers.  Further, the CME announced plans to expand its cryptocurrency offerings by adding Ether futures to its existing Bitcoin futures, while the CBOE recently announced plans to launch indexes tied to various digital assets in early 2021.  The Framework represents a clear indication from the DOJ that it is focused on cryptocurrency-related crimes.  Individuals and companies seeking investment or exposure to the cryptocurrency market should review their compliance obligations in light of the Framework, and ensure any deficiencies are resolved quickly.

Commentators have noted the Trump administration’s aggressive stance towards cryptocurrency, and the Framework certainly tracks that stance.  Of course, it remains to be seen whether the Biden administration will continue to take such an aggressive enforcement posture in the cryptocurrency space.  Some commentators have noted that they expect that the Biden administration will be different.  Notably, Mr. Biden has chosen Gary Gensler to lead his financial policy transition team, and Mr. Gensler has been supportive of cryptocurrencies in past writings.


© 2020 Faegre Drinker Biddle & Reath LLP. All Rights Reserved.
For more, visit the NLR Communications, Media & Internet section.