"Addictive" social media user experience (UX) features on trial
on February 03, 2026
A trial is underway in Los Angeles County Superior Court, where jury selection began on January 27. It tests a novel legal theory intended to usher in greater regulation of social media platforms such as TikTok, Snap, YouTube, and Meta’s Facebook and Instagram: lawyers are preparing to argue that the companies behind these platforms deliberately design their sites to be addictive, causing direct personal injury to users, especially children.
The trial is expected to cover nine cases, selected by judges across the nation as some of the strongest bellwethers for this new argument. First on the docket is a case brought by a 20-year-old plaintiff identified as K.G.M., who says that a lack of sufficient guardrails on social media sites during her youth led to compulsive use and mental health harms including depression, anxiety, body dysmorphia, self-harm, and risk of suicide.
The defendants named in K.G.M.’s initial suit were ByteDance, the majority owner of TikTok; Snap, which owns Snapchat; Google, the owner of YouTube; and Meta. However, both Snap and ByteDance settled in the days leading up to jury selection for undisclosed sums, leaving just Meta and Google.
At the heart of all of these suits lies a design-based claim: that these tech companies use intentionally engineered tricks to foster addictive behaviors among young users. Court documents point to several specific user experience (UX) choices as evidence of this pattern. Here are a few of the key examples in question.
The results of these initial decisions are expected to serve as test cases for a second set of federal cases, scheduled for trial this summer, in which several school districts, states, and attorneys general plan to argue that social media is a public nuisance and addictive to children.
ENDLESS SCROLL
Endless (or infinite) scroll is a web design technique in which new content automatically loads at the bottom of a page as the user scrolls down, creating a seamless, endless stream of information instead of separate pages (pagination). Popular on social media (such as X and Instagram) and news sites, it keeps users engaged by removing the need to click "next page," but it can also lead to excessive use and information overload.
How it works
Dynamic Loading - As you near the end of the current content, the page fetches more items from the server and appends them below.
No Pagination - It eliminates the numbered pages (1, 2, 3...) used in traditional web design, creating a continuous flow.
Hybrid Options - Some sites use a "Load More" button as a middle ground, giving users a choice to continue. (A minimal code sketch of the dynamic-loading pattern follows.)
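As a rough illustration, consider how the dynamic-loading step might look in front-end code. This is a generic, hypothetical sketch: the /api/feed endpoint, its page parameter, and the #feed and #sentinel elements are assumptions for the example, not any platform's real API.

```typescript
// Hypothetical endless-scroll sketch: an IntersectionObserver watches a
// sentinel element at the bottom of the feed and loads the next batch
// whenever it scrolls into view, so no "next page" click is ever needed.
const feed = document.querySelector<HTMLElement>("#feed")!;
const sentinel = document.querySelector<HTMLElement>("#sentinel")!;

let nextPage = 0;
let loading = false;

async function loadMore(): Promise<void> {
  if (loading) return; // avoid duplicate requests while one is in flight
  loading = true;
  // Assumed paginated endpoint; a real platform would call a
  // recommendation service here rather than use simple page numbers.
  const res = await fetch(`/api/feed?page=${nextPage}`);
  const items: { id: string; text: string }[] = await res.json();
  for (const item of items) {
    const card = document.createElement("div");
    card.textContent = item.text;
    feed.appendChild(card);
  }
  nextPage += 1;
  loading = false;
}

// The moment the sentinel becomes visible, fetch and append more content.
new IntersectionObserver((entries) => {
  if (entries.some((e) => e.isIntersecting)) void loadMore();
}).observe(sentinel);
```

The point of the design, and the crux of the plaintiffs' argument, is that no stopping cue ever appears: the loop above has no terminal state.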
In court documents filed before ByteDance’s settlement, K.G.M. testified that TikTok’s endless scroll feature disrupted her sleep and caused her to become addicted to the app. According to confidential internal messages obtained by NPR in October, TikTok is aware of the addictive nature of its central endless scroll “Explore” page and even calculated the number of videos required to get hooked on the app: 260.
EPHEMERAL CONTENT
Ephemeral content is digital media, such as videos or photos on social media, that is available for a short, limited time (typically 24 hours) before disappearing. Popularized by platforms like Snapchat and Instagram Stories, it focuses on raw, authentic, behind-the-scenes moments, creating a "fear of missing out" (FOMO) that drives high, immediate engagement.
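The disappearing-after-24-hours rule is simple to express in code. The sketch below is a hypothetical example assuming a Story record with a posting timestamp; real platforms typically enforce expiry server-side as well (for instance with database TTL mechanisms), not only in the client.

```typescript
// Hypothetical ephemeral-content sketch: a story is visible only while
// its age is under a fixed 24-hour time-to-live (TTL).
interface Story {
  id: string;
  postedAt: number; // Unix epoch milliseconds
}

const TTL_MS = 24 * 60 * 60 * 1000; // 24 hours

function isVisible(story: Story, now: number = Date.now()): boolean {
  return now - story.postedAt < TTL_MS;
}

// Filter a feed down to the stories that have not yet expired.
function visibleStories(stories: Story[], now: number = Date.now()): Story[] {
  return stories.filter((s) => isVisible(s, now));
}
```

The countdown itself is what produces the FOMO effect described above: once the TTL elapses, the content is gone for good.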
Usage in Marketing
Brands use ephemeral content for limited-time offers, behind-the-scenes glimpses, and user-generated content to connect with younger, fast-paced audiences, such as Gen Z.
ALGORITHMIC RECOMMENDATIONS
Algorithmic recommendations are automated, data-driven systems that filter and suggest content, products, or services tailored to individual user preferences and behaviors. Using machine learning and big data, these systems—prevalent on platforms like Netflix, YouTube, and Amazon—analyze past interactions (clicks, watch time, purchases) to predict future engagement.
Types of Systems
Collaborative Filtering: Recommends items based on the preferences of similar users.
Content-Based Filtering: Suggests items similar to those a user liked in the past, focusing on item attributes.
Hybrid Systems: Combines both approaches, as Netflix does, to suggest content based on both behavior and content similarity.
These systems are crucial for managing information overload and help users discover new, relevant items, but they also raise questions about accountability and the at-times detrimental effects of optimizing purely for "engagement."
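To make the "Types of Systems" concrete, here is a deliberately simplified, hypothetical sketch of user-based collaborative filtering over a tiny in-memory ratings matrix. Production recommenders instead learn from engagement signals (clicks, watch time) at enormous scale, but the core idea, scoring unseen items by the tastes of similar users, is the same.

```typescript
// Hypothetical collaborative-filtering sketch: recommend items a user
// has not seen, weighted by how similar other users are to them.
type Ratings = Record<string, Record<string, number>>; // user -> item -> rating

// Cosine similarity over the items two users have both rated.
function similarity(a: Record<string, number>, b: Record<string, number>): number {
  const shared = Object.keys(a).filter((item) => item in b);
  if (shared.length === 0) return 0;
  let dot = 0, normA = 0, normB = 0;
  for (const item of shared) {
    dot += a[item] * b[item];
    normA += a[item] ** 2;
    normB += b[item] ** 2;
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function recommend(ratings: Ratings, user: string, topN = 3): string[] {
  const scores: Record<string, number> = {};
  for (const [other, theirRatings] of Object.entries(ratings)) {
    if (other === user) continue;
    const sim = similarity(ratings[user], theirRatings);
    for (const [item, rating] of Object.entries(theirRatings)) {
      if (item in ratings[user]) continue; // skip items already seen
      scores[item] = (scores[item] ?? 0) + sim * rating;
    }
  }
  return Object.entries(scores)
    .sort(([, x], [, y]) => y - x)
    .slice(0, topN)
    .map(([item]) => item);
}
```

Swapping the explicit ratings for implicit engagement signals, and "rating" for predicted watch time, is essentially how an engagement-optimized feed ends up surfacing whatever keeps a user scrolling.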
In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig., 702 F. Supp. 3d 809 (N.D. Cal. 2023) and In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig., 754 F. Supp. 3d 946 (N.D. Cal. 2024), appeal dismissed sub nom. Fla. Off. of Att'y Gen. v. Meta Platforms, Inc., No. 24-7019, 2024 WL 5443167 (9th Cir. Dec. 16, 2024), and motion to certify appeal denied, No. 4:22-MD-3047-YGR, 2025 WL 1182578 (N.D. Cal. Mar. 11, 2025)
In In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, 754 F. Supp. 3d 946 (N.D. Cal. 2024), the court addressed complex product liability claims against social media companies (Meta, TikTok, Snap, Google) regarding addictive design features harming minors. The court generally rejected an "all or nothing" approach, allowing some negligent design claims to proceed while dismissing others based on Section 230 immunity.
Background
Individuals, school districts, and states’ Attorneys General, on behalf of children and adolescents, filed actions against operators of social media platforms alleging products liability, strict liability, negligence, and negligence per se claims under state law. After cases were transferred to multi-district litigation (MDL), defendants moved to dismiss for failure to state a claim.
The District Court, Yvonne Gonzalez Rogers, J., held that:
1 design defect product liability claims not related to publication of third-party content were not barred by Communications Decency Act (CDA);
2 design defect product liability claims related to publication of third-party content were barred by CDA;
3 strict liability, failure to warn, and negligence claims were not barred by CDA;
4 negligence per se claims were not barred by CDA;
5 platforms’ design choices governing the timing and clustering of notifications constituted creation of the platforms’ own content and were entitled to First Amendment protection;
6 plaintiffs sufficiently alleged that purported defects could be classified as “products,” for purposes of strict products liability claims under New York law and product-based negligence claims under Georgia law, as predicted by the District Court; and
7 plaintiffs sufficiently alleged causation elements of strict products liability claims under New York law and product-based negligence claims under Georgia law, as predicted by the District Court.
Motions granted in part and denied in part.
Products Liability (defenses in general) - Computers and Software - Telecommunications - Privilege or Immunity
Communications Act of 1934 § 230, 47 U.S.C.A. § 230(c)(1).
Design defect product liability claims by individuals, school districts, and states’ Attorneys General, on behalf of children and adolescents, against operators of social media platforms, related to the operators’ use of algorithms to determine whether, when, and to whom to publish third-party content, were barred by the Communications Decency Act (CDA). Although plaintiffs alleged that the algorithms were also crafted to increase the quantity of users’ interaction with the platform regardless of content, decisions determining whether, when, and to whom to publish content, whether made by algorithm or editor, are traditional editorial functions essential to publishing, and plaintiffs did not identify any means by which the operators could fix the alleged defect other than altering when, and to whom, they publish third-party content.
Negligence
A duty to protect users from third party harm is recognized where the actor has created a risk of harm to another or permitted the risk of such harm to increase.
School districts, local government entities, and states’ Attorneys General, on behalf of children and adolescents, filed actions against social media companies alleging claims of product liability, negligence per se, negligence, and public nuisance under the laws of 19 states. Actions were consolidated into multi-district litigation (MDL). The United States District Court for the Northern District of California, 702 F.Supp.3d 809, granted in part and denied in part defendants' motion to dismiss product liability and negligence per se claims. Social media companies moved to dismiss the complaint of school districts and local government entities alleging negligence and public nuisance under state law.
In rulings issued in 2024 and 2025, the District Court, Yvonne Gonzalez Rogers, J., held that:
1 derivative injury rule did not bar negligence and public nuisance claims;
2 districts and entities sufficiently alleged proximate cause, so as to state claims for negligence under laws of 19 states;
3 districts and entities could not recover against companies for injuries stemming from non-foreseeable third-party conduct;
4 districts and entities sufficiently alleged a legal duty based on foreseeability, so as to state claims in negligence under laws of 19 states;
5 public policy supported imposition of duty of care on social media companies;
6 imposition of liability on social media companies did not offend First Amendment; and
7 under Alaska law, economic loss doctrine did not bar districts' and entities' negligence claims.
Motion granted in part and denied in part.