Social Media’s Legal Dilemma: Curated Harmful Content

Walking the Line Between Immunity and Liability: How Social Media Platforms May Be Liable for Harmful Content Specifically Curated for Users

As the proliferation of harmful content online has become easier and more widespread through social media, review websites, and other online public forums, businesses and politicians have pushed to reform and limit the sweeping protections afforded by Section 230 of the Communications Decency Act, which is often said to have created the Internet. Congress enacted Section 230 of the Communications Decency Act of 1996 “for two basic policy reasons: to promote the free exchange of information and ideas over the Internet and to encourage voluntary monitoring for offensive or obscene material.” Congress intended for the Internet to flourish, and the goal of Section 230 was to promote the unhindered development of Internet businesses, services, and platforms.

To that end, Section 230 immunizes online service providers and interactive computer services from liability for posting, re-publishing, or allowing public access to offensive, damaging, or defamatory information or statements created by a third party. Specifically, Section 230(c)(1) provides:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

[47 U.S.C. § 230(c)(1)]

Section 230 has been widely interpreted to protect online platforms from being held liable for user-generated content, thereby promoting the free exchange of information and ideas over the Internet. See, e.g., Hassell v. Bird, 5 Cal. 5th 522 (2018) (Yelp not liable for defamatory reviews posted on its platform and cannot be forced to remove them); Doe II v. MySpace Inc., 175 Cal. App. 4th 561, 567–575 (2009) (§ 230 immunity applies to tort claims against a social networking website, brought by minors who claimed that they had been assaulted by adults they met on that website); Delfino v. Agilent Technologies, Inc., 145 Cal. App. 4th 790, 804–808 (2006) (§ 230 immunity applies to tort claims against an employer that operated an internal computer network used by an employee to allegedly communicate threats against the plaintiff); Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 826–836 (2002) (§ 230 immunity applies to tort and statutory claims against an auction website, brought by plaintiffs who allegedly purchased forgeries from third-party sellers on the website).

Thus, under § 230, lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content—are barred. Under the statutory scheme, an “interactive computer service” qualifies for immunity so long as it does not also function as an “information content provider” for the portion of the statement or publication at issue. Even users or platforms that “re-post” or “publish” allegedly defamatory or damaging content created by a third party are exempted from liability. See Barrett v. Rosenthal, 40 Cal. 4th 33, 62 (2006). Additionally, merely compiling false and/or misleading content created by others, or otherwise providing a structured forum for dissemination and use of that information, is not enough to confer liability. See, e.g., Gentry v. eBay, Inc., 99 Cal. App. 4th 816 (the critical issue is whether eBay acted as an information content provider with respect to the information claimed to be false or misleading); Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1122–1124 (9th Cir. 2003) (Matchmaker.com not liable for a fake dating profile of a celebrity who then began receiving sexually explicit and threatening emails and voicemails).

Recently, however, the U.S. Court of Appeals for the Third Circuit found that Section 230 did not immunize popular social media platform TikTok from a suit arising from a ten-year-old’s death after she attempted the “Blackout Challenge” based on videos she watched on her TikTok “For You Page.” See Anderson v. TikTok, Inc., 116 F.4th 180 (3d Cir. 2024). TikTok is a social media platform where users can create, post, and view videos. Users can search for specific content or watch videos recommended by TikTok’s algorithm on their “For You Page” (FYP). This algorithm customizes video suggestions based on a range of factors, including a user’s age, demographics, interactions, and other metadata—not solely on direct user inputs. Some videos on TikTok’s FYP are “challenges” that encourage users to replicate the actions shown. One such video, the “Blackout Challenge,” urged users to choke themselves until passing out. TikTok’s algorithm recommended this video to a ten-year-old girl who attempted it and tragically died from asphyxiation.

The deciding question was whether TikTok’s algorithm, and the inclusion of the “Blackout Challenge” video on a user’s FYP, crossed the threshold between an immune publisher and a liable creator. Plaintiff argued that TikTok’s algorithm “amalgamat[es] [] third-party videos,” which results in “an expressive product” that “communicates to users . . . that the curated stream of videos will be interesting to them.” The Third Circuit agreed, finding that a platform’s algorithm reflecting “editorial judgments” about “compiling the third-party speech it wants in the way it wants” is the platform’s own “expressive product,” and that TikTok’s algorithm, which recommended the Blackout Challenge on the decedent’s FYP, was therefore TikTok’s own “expressive activity.” As such, Section 230 did not bar claims against TikTok arising from its FYP recommendations, because Section 230 immunizes only information “provided by another,” and here the claims concerned TikTok’s own expressive activity.

The Court was careful to note that its conclusion rested on the fact that TikTok’s promotion of the Blackout Challenge video on the decedent’s FYP was not contingent on any specific user input; i.e., the decedent did not search for and view the Blackout Challenge video through TikTok’s search function. TikTok has certainly taken issue with the Court’s ruling, contending that if websites lose § 230 protection whenever they exercise “editorial judgment” over the third-party content on their services, then the exception would swallow the rule. Perhaps websites seeking to avoid liability will refuse to sort, filter, categorize, curate, or take down any content, which may result in unfiltered and randomly placed objectionable material on the Internet. On the other hand, some websites may err on the side of removing any potentially harmful third-party speech, which would chill the proliferation of free expression on the web.
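To make the distinction the Court drew more concrete, the sketch below contrasts, in deliberately simplified and hypothetical terms, platform-driven recommendation (the platform selects content from signals it has inferred about the user, with no query at all) against user-initiated search (the user supplies the input and the platform merely matches it). The data structures, scoring factors, and function names are illustrative assumptions only and are not drawn from TikTok’s actual systems.

```typescript
// Hypothetical, highly simplified contrast between algorithmic recommendation
// and user-initiated search. Not TikTok's actual code or ranking factors.

interface Video {
  id: string;
  tags: string[];
  engagementRate: number; // how well the video performs platform-wide
}

interface UserProfile {
  age: number;
  region: string;
  watchedTags: Map<string, number>; // tag -> how often the user engaged with it
}

// Recommendation: the platform chooses videos from inferred signals
// (age, demographics, past interactions); the user supplies no query.
function recommendForYou(user: UserProfile, catalog: Video[], count: number): Video[] {
  return catalog
    .map(video => {
      const affinity = video.tags.reduce(
        (sum, tag) => sum + (user.watchedTags.get(tag) ?? 0),
        0,
      );
      return { video, score: affinity + video.engagementRate };
    })
    .sort((a, b) => b.score - a.score)
    .slice(0, count)
    .map(scored => scored.video);
}

// Search: the user supplies the input; the platform only matches it.
function search(query: string, catalog: Video[]): Video[] {
  const q = query.toLowerCase();
  return catalog.filter(video => video.tags.some(tag => tag.toLowerCase().includes(q)));
}
```

In Anderson, it was the first path, content surfaced by the platform’s own curation rather than returned in response to the user’s query, that the Third Circuit treated as the platform’s own expressive activity.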

The aftermath of the ruling remains to be seen, but for now, social media platforms and interactive websites should take note and re-evaluate the purpose, scope, and mechanics of their user-engagement algorithms.

U.S. Sues TikTok for Children’s Online Privacy Protection Act (COPPA) Violations

On Friday, August 2, 2024, the United States sued ByteDance, TikTok, and their affiliates for violating the Children’s Online Privacy Protection Act of 1998 (“COPPA”) and the Children’s Online Privacy Protection Rule (“COPPA Rule”). In its complaint, the Department of Justice alleges that TikTok collected, stored, and processed vast amounts of data from millions of child users of its popular social media app.

In June, the FTC voted to refer the matter to the DOJ, stating that it had determined there was reason to believe TikTok (f/k/a Musical.ly, Inc.) had violated a 2019 FTC consent order and that the agency had also uncovered additional potential COPPA and FTC Act violations. The lawsuit, filed in the U.S. District Court for the Central District of California, alleges that TikTok is directed to children under age 13, that TikTok has permitted children to evade its age gate, that TikTok has collected data from children without first notifying their parents and obtaining verifiable parental consent, that TikTok has failed to honor parents’ requests to delete their children’s accounts and information, and that TikTok has failed to delete the accounts and information of users the company knows are children. The complaint also alleges that TikTok failed to comply with COPPA even for accounts in the platform’s “Kids Mode” and that TikTok improperly amassed profiles on Kids Mode users. The complaint seeks civil penalties of up to $51,744 per violation per day from January 10, 2024, to the present for the improper collection of children’s data, as well as permanent injunctive relief to prevent future violations of the COPPA Rule.

The lawsuit comes on the heels of the U.S. Senate’s passage this week of the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA) by a 91-3 bipartisan vote. It is unknown whether the House will take up the bills when it returns from recess in September.

U.S. House of Representatives Passes Bill to Ban TikTok Unless Divested from ByteDance

Yesterday, with broad bipartisan support, the U.S. House of Representatives voted overwhelmingly (352-65) to pass the Protecting Americans from Foreign Adversary Controlled Applications Act, which is designed to begin the process of banning TikTok’s use in the United States. This is music to my ears. See a previous blog post on this subject.

The Act would penalize app stores and web hosting services that host TikTok while it is owned by China-based ByteDance. However, if the app is divested from ByteDance, the Act would allow continued use of TikTok in the U.S.

National security experts have warned legislators and the public that downloading and using TikTok poses a national security threat because ByteDance is required by Chinese law to share users’ data with the Chinese Communist government. When a user downloads the app, TikTok obtains access to the user’s microphone, camera, and location services, effectively acting as spyware on over 170 million Americans’ every move (dance or not).

Lawmakers are concerned about the detailed sharing of Americans’ data with one of the country’s top adversaries and about the ability of TikTok’s algorithms to influence the American people and launch disinformation campaigns against them. The Act will now make its way through the Senate, and if it passes, President Biden has indicated that he will sign it. This is a big win for privacy and national security.

Copyright © 2024 Robinson & Cole LLP. All rights reserved.
by: Linn F. Freedman of Robinson & Cole LLP


Locking Tik Tok? White House Requires Removal of TikTok App from Federal IT

On February 28, the White House issued a memorandum giving federal employees 30 days to remove the TikTok application from any government devices. This memo is the result of an act passed by Congress that requires the removal of TikTok from any federal information technology. The act responded to concerns that the Chinese government may use data from TikTok for intelligence gathering on Americans.

I’m Not a Federal Employee — Why Does It Matter?

The White House Memo clearly covers all employees of federal agencies. However, it also covers contractors who use federal information technology. As such, if you are a federal contractor using software or technology required by the U.S. government, you must remove TikTok within the next 30 days.

The limited exceptions to the removal mandate require federal government approval. The memo mentions national security interests and activities, law enforcement work, and security research as possible exceptions. However, there is a process to apply for an exception – it is not automatic.

Takeaways

Even if you are not a federal employee or a government contractor, this memo is a good prompt to revisit your company’s social media policies and cell phone use procedures. Do you want TikTok (or any other social media app) on your devices? Many companies have found themselves in PR trouble due to lapses in enforcement of these types of rules. In addition, excessive use of social media in the workplace has been shown to be a drag on productivity.

© 2023 Bradley Arant Boult Cummings LLP

University of Texas at Austin Permanently Blocks TikTok on Network

On Tuesday, January 17, 2023, the University of Texas at Austin announced that it has blocked TikTok access across the university’s networks. According to the announcement to its users, “You are no longer able to access TikTok on any device if you are connected to the university via its wired or WIFI networks.” The measure was in response to Governor Greg Abbott’s December 7, 2022, directive to all state agencies to eliminate TikTok from state networks. Following the directive, the University removed TikTok from university-issued devices, including cell phones, laptops, and workstations.

Copyright © 2023 Robinson & Cole LLP. All rights reserved.


Nineteen States Have Banned TikTok on Government-Issued Devices

Governors of numerous states have issued Executive Orders in the past several weeks banning TikTok from government-issued devices, and many have already implemented a ban, with others considering similar measures. There is also bipartisan support for a ban in the Senate, which unanimously approved a bill last week that would ban the app from devices issued by federal agencies. There is already a ban prohibiting military personnel from downloading the app on government-issued devices.

The bans are in response to the national security concerns that TikTok poses to U.S. citizens.

To date, 19 states have issued some sort of ban on the use of TikTok on government-issued devices, including some Executive Orders banning the use of TikTok statewide on all government-issued devices. Other state officials have implemented a ban within an individual state department, such as the Louisiana Secretary of State’s Office. In 2020, Nebraska was the first state to issue a ban. Other states that have banned TikTok use in some way are: South Dakota, North Dakota, Maryland, South Carolina, Texas, New Hampshire, Utah, Louisiana, West Virginia, Georgia, Oklahoma, Idaho, Iowa, Tennessee, Alabama, Virginia, and Montana.

Indiana’s Attorney General filed suit against TikTok alleging that the app collects and uses individuals’ sensitive and personal information, but deceives consumers into believing that the information is secure. We anticipate that both the federal government and additional state governments will continue to assess the risk and issue bans on its use in the next few weeks.

Copyright © 2022 Robinson & Cole LLP. All rights reserved.

ANOTHER TRILLION DOLLAR CASE? TikTok Hit in MASSIVE CIPA Suit Over Its Business Model of Profiting from Advertising by Collecting and Monetizing User Data

Data privacy lawsuits are EXPLODING, and the privacy issues of TikTok, one of our country’s most popular mobile apps, keep piling up.

Following its recent $92 million class-action data privacy settlement for its alleged violation of the Illinois Biometric Information Privacy Act (BIPA), TikTok is now facing a California Invasion of Privacy Act (CIPA) and federal Wiretap Act class action for collecting users’ data via its in-app browser without Plaintiff’s and class members’ consent.

The complaint alleges that “[n]owhere in [TikTok’s] Terms of Service or the privacy policies is it disclosed that Defendants compel their users to use an in-app browser that installs JavaScript code into the external websites that users visit from the TikTok app which then provides TikTok with a complete record of every keystroke, every tap on any button, link, image or other component on any website, and details about the elements the users clicked.”

Despite being a free app, TikTok makes billions in revenue by collecting and monetizing users’ data, allegedly without their consent.

“The world’s most valuable resource is no longer oil, but data.”

As we’ve discussed before, many companies do collect data for legitimate purposes and with consent. However, this new complaint alleges a very specific type of data collection practice undertaken without the TikTok user’s OR the third-party website operator’s consent.

TikTok allegedly relies on selling digital advertising spots for income, and the algorithm used to determine which advertisements to display on a user’s home page utilizes tracking software to understand a user’s interests and habits. To drive this business, TikTok presents users with links to third-party websites that open in TikTok’s in-app browser without the user (or the third-party website operator) knowing this is occurring, and the user’s keystrokes are simultaneously intercepted and recorded.

“Specifically, when a user attempts to access a website, by clicking a link while using the TikTok app, the website does not open via the default browser. Instead, unbeknownst to the user, the link is opened inside the TikTok app, in [TikTok’s] in-app browser. Thus, the user views the third-party website without leaving the TikTok app.”
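To help non-technical readers picture what the complaint is describing, the sketch below shows, in generic and hypothetical form, how a script injected into a page loaded in an in-app browser could register document-level event listeners and report keystrokes and taps back to the host application. It is an illustration of the technique alleged, not TikTok’s actual code; the bridge object name and message format are assumptions.

```typescript
// Hypothetical sketch of the kind of in-app browser instrumentation alleged in
// the complaint: a script injected into a third-party page that reports user
// interactions back to the host app. Not TikTok's actual code.

// Many in-app browsers expose a native "bridge" object to injected scripts;
// the name and shape used here are assumptions for illustration only.
declare const nativeBridge: { postMessage: (payload: string) => void } | undefined;

function report(event: string, detail: Record<string, unknown>): void {
  const payload = JSON.stringify({ event, detail, url: location.href, ts: Date.now() });
  // Forward the captured interaction to the host application, if the bridge exists.
  nativeBridge?.postMessage(payload);
}

// Record every keystroke typed into the third-party page.
document.addEventListener("keydown", (e: KeyboardEvent) => {
  report("keydown", { key: e.key });
}, true);

// Record every tap or click, including which element was interacted with.
document.addEventListener("click", (e: MouseEvent) => {
  const target = e.target as HTMLElement | null;
  report("click", { tag: target?.tagName, id: target?.id });
}, true);
```

Because such a script would run inside the third-party page itself, neither the user nor the site operator would necessarily see anything different from ordinary browsing, which is the gap in consent the complaint emphasizes.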

The TikTok in-app browser does not just track purchase information; it allegedly tracks detailed private and sensitive information, including information about a person’s physical and mental health.

For example, health providers and pharmacies, such as Planned Parenthood, have a digital presence on TikTok, with videos that appear on users’ feeds.

Once a user clicks on such a provider’s link, he or she is directed to Planned Parenthood’s main webpage via TikTok’s in-app browser. While the user is assured that his or her information is “private and anonymous,” TikTok is allegedly intercepting it and monetizing it to send targeted advertisements to the user, without the user’s or Planned Parenthood’s consent.

The complaint not only details the global concerns regarding TikTok’s privacy practices (including FTC investigations, the outright ban preventing the U.S. military from using the app, TikTok’s BIPA lawsuit, and an uptick in privacy advocates’ concerns), it also specifically calls out the concerns around collecting reproductive health information after the demise of Roe v. Wade this year:

“TikTok’s acquisition of this sensitive information is especially concerning given the Supreme Court’s recent reversal of Roe v. Wade and the subsequent criminalization of abortion in several states. Almost immediately after the precedent-overturning decision was issued, anxieties arose regarding data privacy in the context of commonly used period and ovulation tracking apps. The potential of governments to acquire digital data to support prosecution cases for abortions was quickly flagged as a well-founded concern.”

Esh. The allegations in the 76-page complaint are alarming.

In any event, the class is alleged as:

“Nationwide Class: All natural persons in the United States who used the TikTok app to visit websites external to the app, via the in-app browser.

California Subclass: All natural persons residing in California who used the TikTok app to visit websites external to the app, via the in-app browser.”

The complaint alleges that California law applies to all class members. As with the Meta CIPA complaint, we will have to wait and see how a nationwide class can be brought under a California statute.

On the CIPA claim, the Plaintiff, Austin Recht, seeks an unspecified amount of damages for the class, but the demand is $5,000 per violation or three times the amount of damages sustained by Plaintiff and the class, in an amount to be proven at trial.

We’ll obviously continue to keep an eye on this.

Article By Puja J. Amin of Troutman Firm


© 2022 Troutman Firm

Judge Approves $92 Million TikTok Settlement

On July 28, 2022, a federal judge approved TikTok’s $92 million class action settlement of various privacy claims made under state and federal law. The agreement will resolve litigation that began in 2019 and involved claims that TikTok, owned by the Chinese company ByteDance, violated the Illinois Biometric Information Privacy Act (“BIPA”) and the federal Video Privacy Protection Act (“VPPA”) by improperly harvesting users’ personal data. U.S. District Court Judge John Lee of the Northern District of Illinois also awarded approximately $29 million in fees to class counsel.

The class action claimants alleged that TikTok violated BIPA by collecting users’ faceprints without their consent and violated the VPPA by disclosing personally identifiable information about the videos people watched. The settlement agreement also provides for several forms of injunctive relief, including:

  • Refraining from collecting and storing biometric information, collecting geolocation data and collecting information from users’ clipboards, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Not transmitting or storing U.S. user data outside of the U.S., unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • No longer pre-uploading U.S. user-generated content, unless this is expressly disclosed in TikTok’s privacy policy and done in accordance with all applicable laws;
  • Deleting all pre-uploaded user-generated content from users who did not save or post the content; and
  • Training all employees and contractors on compliance with data privacy laws and company procedures.
Copyright © 2022, Hunton Andrews Kurth LLP. All Rights Reserved.

National Security Meets Teenage Dance Battles: Trump Issues Executive Orders Impacting TikTok and WeChat Business in the U.S.

On August 6, 2020, President Trump issued two separate executive orders that will severely restrict TikTok’s and WeChat’s business in the United States. For weeks, the media has reported on the President’s desire to “ban” TikTok, with speculation about the legal authority to do so. We break down the impact of the Orders below.

The White House has been threatening for weeks to ban both apps in the interest of protecting “the national security, foreign policy, and economy of the United States.”  According to the Orders issued Thursday, the data collection practices of both entities purportedly “threaten[] to allow the Chinese Communist Party access to Americans’ personal and proprietary information — potentially allowing China to track the locations of Federal employees and contractors, build dossiers of personal information for blackmail, and conduct corporate espionage.”

This is not a new threat.  A variety of government actions in recent years have been aimed at mitigating the national security risks associated with foreign adversaries stealing sensitive data of U.S. persons.  For example, in 2018, the Foreign Investment Risk Review Modernization Act (FIRRMA) was implemented to expand the authority of the Committee on Foreign Investment in the United States (CFIUS) to review and address national security concerns arising from foreign investment in U.S. companies, particularly where foreign parties can access the personal data of U.S. citizens.  And CFIUS has not been hesitant about exercising this authority.  Last year, CFIUS required the divestment of a Chinese investor’s stake in Grindr, the popular gay dating app, because of concerns that the Chinese investor would have access to U.S. citizens’ sensitive information which could be used for blackmail or other nefarious purposes.  That action was in the face of Grindr’s impending IPO.

In May 2019, President Trump went one step further, issuing Executive Order 13873 to address a “national emergency with respect to the information and communications technology and services supply chain.” That Order stated that foreign adversaries were taking advantage of vulnerabilities in the American IT and communications services supply chain and described broad measures to address that threat. According to these new Orders, further action is necessary to address these threats. EO 13873 and the TikTok and WeChat Orders were all issued under the International Emergency Economic Powers Act (IEEPA), which provides the President broad authority to regulate transactions that threaten national security during a national emergency.

Order Highlights

Both Executive Orders provide the Secretary of Commerce broad authority to prohibit transactions involving the parent companies of TikTok and WeChat, with the specific transactions subject to prohibition yet to be defined.

  • The TikTok EO prohibits “any transaction by any person, or with respect to any property, subject to the jurisdiction of the United States,” with ByteDance Ltd., TikTok’s parent company, “or its subsidiaries, in which any such company has any interest, as identified by the Secretary of Commerce.”
  • The WeChat EO prohibits “any transaction that is related to WeChat by any person, or with respect to any property, subject to the jurisdiction of the United States,” with Tencent Holdings Ltd., WeChat’s parent company, “or any subsidiary of that entity, as identified by the Secretary of Commerce.”
  • Both Executive Orders will take effect 45 days after issuance of the order (September 20, 2020), by which time the Secretary of Commerce will have identified the transactions subject to the Orders.

Implications

Until the Secretary of Commerce identifies the scope of transactions prohibited by the Executive Orders, the ultimate ramifications of these Orders remain unclear.  However, given what we do know, we have some initial thoughts on how these new prohibitions may play out.  The following are some preliminary answers to the burning questions at the forefront of every American teenager’s (and business person’s) mind.

Q:  Do these Orders ban the use of TikTok or WeChat in the United States?

A:  While the Orders do not necessarily ban the use of TikTok or WeChat themselves, the apps (or any future software updates) may no longer be available for download in the Google or Apple app stores in the U.S., and U.S. companies may not be able to purchase advertising on the platforms – effectively (if not explicitly) banning the apps from the United States.

Q:  Will all transactions with ByteDance Ltd. and Tencent Holdings Ltd. (TikTok and WeChat’s parent companies, respectively) be prohibited?

A:  Given the broad language in the Orders, it does appear that U.S. app stores, carriers, or internet service providers (ISPs) will likely not be able to continue carrying the services while TikTok and WeChat are owned by these Chinese entities. However, it is unlikely that the goal is to prohibit all transactions with these companies as a deterrent or punishment tool, which would essentially amount to designating them as Specially Designated Nationals (SDNs); the Orders clearly contemplate some limitations, to be imposed by the Secretary of Commerce, on the types of transactions subject to the Orders. Furthermore, the national security policy rationale for such restrictions will not be present in all transactions (i.e., if the concern is the ability of Chinese entities to access personal data of U.S. citizens in a manner that could be used against the interests of the United States, then presumably transactions in which ByteDance Ltd. and Tencent Holdings Ltd. do not have access to such data should be permissible). So while we do not know exactly what the scope of prohibited transactions will be, it would appear that the goal is to restrict these entities’ access to U.S. data and any transactions that would facilitate or allow such access.

Q:  What does “any property, subject to the jurisdiction of the United States” mean?

A:  Normally, the idea behind such language is to limit the prohibited transactions to those with a clear nexus to the United States: any U.S. person or person within the United States, or involving property within the United States.  It is unlikely that transactions conducted wholly outside the United States by non-U.S. entities would be impacted.  From a policy perspective, it would make sense that the prohibitions be limited to transactions that would facilitate these Chinese entities getting access to U.S.-person data through the use of TikTok and WeChat.

Q:  What about the reported sale of TikTok?

A: There is a chance the restrictions outlined in the TikTok EO will become moot.  Reportedly, Microsoft is in talks with ByteDance to acquire TikTok’s business in the United States and a few other jurisdictions.  If the scope of prohibited transactions is tailored to those involving access to U.S.-person data, and if a U.S. company can assure that U.S. user data will be protected, then the national security concerns over continued use of the app would be mitigated.  Unless and until such an acquisition takes place, U.S. companies investing in TikTok or utilizing it for advertising should be prepared for the restrictions to take effect.  At this time, there do not appear to be any U.S. buyers in the mix for WeChat.

Q:  The WeChat EO prohibits any transaction that is “related to” WeChat…what does that mean?

A:  The WeChat prohibition is more ambiguous and could have a significantly wider impact on U.S. business interests. WeChat is widely used in the United States, particularly by people of Chinese descent, to carry out business transactions, including communicating with, and making mobile payments to, various service providers.  The WeChat EO prohibits “any transaction that is related to WeChat” with Tencent Holdings Ltd. or any of its subsidiaries.  Unlike TikTok, WeChat’s services extend beyond social media.  While the language of the ban is vague and the prohibited transactions are yet to be determined, it appears likely that using WeChat for these communications and transactions may no longer be legal. It is also unclear whether the WeChat prohibition will extend to other businesses tied to Tencent, WeChat’s parent company, including major gaming companies Epic Games (publisher of the popular “Fortnite”), Riot Games (“League of Legends”), and Activision Blizzard, in all of which Tencent has substantial ownership interests.  There has been some reporting that a White House official confirmed Tencent’s gaming interests are excluded from the Order as being unrelated to WeChat, but until the Secretary of Commerce specifies the prohibited transactions, the scope of the Order remains uncertain.

Bottom Line

Until the Secretary of Commerce issues its list of transactions prohibited under these Executive Orders, the scope and effect of these Orders remain conjectural.  This Administration’s all-in posture towards China would suggest that the prohibitions could be broad and severe.  U.S. companies utilizing WeChat or TikTok for business purposes, or conducting business with the apps’ owners, should think carefully about ongoing and future transactions.  Of course, there is an election right around the corner, and a new Administration may bring significant change to related foreign, trade, and technology policy.  Thoughtful planning for a variety of scenarios will enable companies to respond appropriately as the restrictions on TikTok and WeChat are crystallized.


Copyright © 2020, Sheppard Mullin Richter & Hampton LLP.