The court summarizes:
The plaintiffs commenced this action in connection with the death by suicide of 16-year-old Chase Nasca on February 18, 2022, after he walked in front of a train. Plaintiffs’ complaint alleges twelve causes of action sounding in design defect, strict products liability, general negligence, negligent design, wrongful death, conscious pain and suffering, violation of the consumer protection laws of New York, unjust enrichment, invasion of privacy, intentional infliction of emotional distress, joint and several liability, and loss of services.
For more on Chase’s tragedy, see the People magazine story or the Social Media Victims Law Center’s press release about the lawsuit.
This case was originally transferred into the Northern District of California MDL, but the plaintiffs exited the MDL and pursued a standalone action. I infer that the procedural goal was to link the TikTok claims with the related New York action against the railroad agency, but I’m not sure.
TikTok defended on Section 230 grounds. The court responds:
While plaintiffs’ complaint purportedly “seek[s] to hold the TikTok Defendants accountable for TikTok’s own operations, conduct, and product — not for the speech of others or for their content moderation decisions,” plaintiffs specifically state that TikTok directed Chase to videos suggesting that young people should end their lives by stepping in front of a moving train and, shortly after receiving one such video, Chase accessed tracks for the Long Island Railroad near his home and died. Thus, the crux of plaintiffs’ complaint is that TikTok guided Chase to third-party video content that promoted suicide or self-harm, ultimately resulting in his death. The Court concludes that plaintiffs’ complaint and each of their causes of action are fundamentally based upon the way TikTok publishes its materials to users by use of its algorithms. While plaintiffs attempt to characterize TikTok’s underlying actions as “design, engineering, and programming decisions,” they admit that Chase died because TikTok failed to adequately monitor and remove suicidal content. Accordingly, plaintiffs seek to hold defendants liable for their exercise of a publisher’s traditional editorial functions.
Among other points, the plaintiffs argued that TikTok should have age-authenticated all users. (This is a standard argument in minors’ lawsuits against social media.) A reminder that such a legal principle would synthetically create a common-law segregate-and-suppress mandate, which would not be a positive development for children (or adults).
The plaintiffs are surely going to appeal this ruling.
Case Citation: Nasca v. Bytedance Ltd., 2025 N.Y. Misc. LEXIS 2255 (N.Y. Sup. Ct. April 14, 2025). The CourtListener page has details about the case’s time in federal court before it was remanded to state court.
* * *
Some additional recent Section 230 cases:
Mann v. Meta Platforms Inc., 2025 U.S. Dist. LEXIS 70065 (N.D. Cal. April 11, 2025)
This is a pro se/IFP case. The court summarizes the complaint:
Mann alleges that, while scrolling on the Facebook App, he “was sent multiple invites to join a Chrystal Meth Club, and a Chrystal Meth Whore Club,” which depicted pictures “of meth pipes and actual methamphetamines bags,” and also included “[e]xtremely disturbing” imagery suggesting “illegal business” having to do with a “shady looking drug dealer/ pimp type figure hiding in the background[.]” [Docket No. 1 (“Compl.”) ¶ 6; see also id. at 5 (“[F]acebook then bagan [sic] sending me invites for meth whore clubs and Chrystal meth clubs.”).] Mann, “a recovered drug addict,” alleges that this “unlawful, immoral, and unethical behavior” is contrary to Meta’s terms of service because Meta “stat[ed] that their product is ‘not to be used for any unlawful purposes, or assist someone else in usinng [sic] their products in such a way, or at the expense of the safety and well being of others or the integrity of the community.’”
The court says the claims “may well be barred” by Section 230. As a courtesy, the court gives the plaintiff another chance to plead his case around Section 230.
ICS Provider. “[C]ourts regularly find that Meta and Facebook are providers of interactive computer services.”
Third-Party Content. Mann “appears to allege that the meth-related information for which he is trying to hold Meta liable was created by other Facebook users even if it was ‘sent’ to him by Meta/Facebook or otherwise displayed ‘on [his] personal feed.’”
Publisher/Speaker Claim.
Mann alleges that Meta “stat[ed] that their product is ‘not to be used for any unlawful purposes, or assist someone else in usinng [sic] their products in such a way, or at the expense of the safety and well being of others or the integrity of the community.’” Even assuming this language came from Meta’s terms of service, it is not a “promise” by Meta to take any particular action. Indeed, the language that Facebook “is not to be used for any unlawful purposes” suggests that these terms apply to the Facebook users who posted the offending content and was not a promise made to Mann at all. Unless Mann can point to specific promises in the terms of service or elsewhere, establishing a contractual or legal duty, Meta’s failure to remove meth-related third-party content arises from Meta’s status as a publisher and would require Meta to monitor (and more effectively remove) such content. This theory of liability is barred by section 230.
This analysis highlights a problem with the YOLO decision, which allowed TOS statements that were more like puffery than enforceable promises to serve as a Section 230 workaround.
Rapaport v. Iyer, 2025 WL 966275 (S.D.N.Y. March 31, 2025). The CourtListener page.
The plaintiff was a law student at NYU and a summer associate at Kirkland & Ellis when he claims he got canceled for his (conservative) views. Rapaport benefited from a letter of recommendation from the well-known law professor Richard Epstein. A reminder that Prof. Epstein problematically strayed outside his swimlane to predict in March 2020 that COVID would only kill 500 Americans (a number he quickly revised to 5,000). Fact check: COVID has killed over 1.2M Americans.
As a thank-you to Prof. Epstein for his support, Rapaport sued Epstein for fraud and tortious interference. Above the Law coverage of the lawsuit. Rapaport complained about Epstein forwarding a disparaging email about him, and Epstein invoked Section 230. In general, Section 230 can apply to email forwarding, a topic I’ve covered repeatedly on the blog. However, the court says Section 230 does not apply here: “Plaintiff alleges Epstein ‘knowingly lied’ about the identities of Student Defendants when asked, and that he ‘repeated his disparagement at every possible opportunity to anyone who would listen.’ Because these allegations embrace and depend on conduct beyond just Epstein’s emails, Section 230 would not prevent liability.”
Amin v. O’Brien, 2025 WL 1042719 (S.D. Ga. April 8, 2025)
A defamation lawsuit over a true-crime podcast hosted by Wondery. “To show that Amazon and Wondery are responsible for the content of the podcast, at least in part, Plaintiff argues that Amazon and Wondery ‘offer services included [sic] “develop[ment] and grow[th]” to podcast creators.’ However, Plaintiff’s complaint does not contain such an allegation. This is problematic for Plaintiff as he attempts to ward off Amazon and Wondery’s CDA immunity defense.”