AI-Driven Tools Enabling Social Media Connections Between Minors, Strangers: Suit
Social media platforms’ AI-driven user recommendation tools are facilitating and creating connections between minors and “complete strangers,” fueling an “unprecedented mental health crisis,” said a Sept. 12 complaint (docket 238-cv-04270) in California Superior Court for Los Angeles County.
The plaintiffs in the negligence action, represented by Social Media Victims Law Center, name as defendants social media platforms Meta, Snap, TikTok and Google for their “studied efforts to induce young people to compulsively use their products.” The companies have borrowed “heavily” from behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, the complaint said. The minors, described as happy before they began heavy social media use, became depressed and underwent personality changes after spending countless hours on the platforms, the complaint alleges.
Defendants deliberately embedded in their products design features “aimed at maximizing youth engagement to drive advertising revenue,” said the complaint. Though the defendants “know children are in a developmental stage that leaves them particularly vulnerable to the addictive effects” of social media platform features, they “target them anyway, in pursuit of additional profit,” it said.
Social media platforms have “rewired how our kids think, feel, and behave,” said the complaint, noting that growing scientific research has drawn a “direct line” between defendants’ “conscious, intentional design choices and the youth mental health crisis gripping our nation.” Defendants' products have “promoted disconnection, disassociation, and a legion of resulting mental and physical harms” in multiple ways, it said.
Plaintiff M.S., on behalf of 12-year-old Louisiana resident A.D., asserts claims against Snap and TikTok, saying that, due to interactions on social media, A.D. became “depressed and isolated,” struggled to sleep and began to “self-harm and engage in cutting,” for which he was hospitalized “numerous times.”
Plaintiff Monica Ortiz, on behalf of Fidel Marin, who died in 2021 at age 16, alleged her son suffered mental health harms inflicted on him through his use of Facebook, Instagram and YouTube. Fidel, of San Bernardino, California, “acted and felt like he could not live without Instagram, Facebook, and YouTube, and began spending every waking moment on those platforms,” said the complaint. The more time he spent on the platforms, “the less sleep he got and the more anxious and depressed he became.”
Plaintiff Kristina Cahak said her daughter, Morgan, of Cape Coral, Florida, downloaded Facebook and Instagram at 12 and Snapchat at 13 without her mother’s consent. After scrolling for increasing amounts of time through “endless feeds” of photos of self-harm and other depressing content, she withdrew from friends, became sleep-deprived and began to cut herself. Morgan died by suicide at 15 in 2015.
Bryan Kroll, of Haskell, New Jersey, was targeted by AI-driven, feed-based tools on Meta with content containing “harmful social comparison” and other materials “urging him to commit acts of self-harm,” said the complaint. Meta deliberately directed “amounts of depressive, violent, and related materials to Bryan based on his age and gender.” Bryan died by suicide in 2016 at 15, it said.
Plaintiff M.D. asserts claims against Snap on behalf of her daughter T.M., a 16-year-old who lives in Florida. Snap deliberately directed “excessive amounts of depressive, health, predatory, and related materials to T.M. based on her age and gender, as well as Snapchat’s Streaks, Score, Snap Map, and other gamifications and social metric features,” said the complaint. T.M. suffered harms including “dangerous dependency on their products, addiction, sleep deprivation, anxiety, depression, child sex abuse, self-harm, suicidal ideation, and attempted suicide,” it said.
Plaintiff M.M., on behalf of 17-year-old Missouri resident T.R., alleges her daughter was diagnosed with severe depressive disorder after developing a negative body image in connection with her use of Instagram, Snapchat, YouTube and TikTok. She attempted suicide multiple times and was hospitalized as a result, it said.
Plaintiff April Tsosie, as successor-in-interest to Melissa Buckles, a 17-year-old from Billings, Montana, who died in 2018 due to harms inflicted by Facebook, Instagram and Snapchat’s products, said Melissa “went from a happy and outgoing young woman to someone who was always depressed and anxious and would obsess over her weight and appearance” as a result of social media use. Melissa attempted suicide for the first time at age 15, said the complaint.
Plaintiff J.S. asserts claims that Instagram and TikTok caused her daughter J.U., a 17-year-old Oregon resident, to develop eating disorders and experience depression and anxiety arising from “seemingly random connections between her and complete strangers” on the platforms. Defendants’ knowing and deliberate product design, marketing, and operational conduct caused “serious emotional, mental, and physical harms” to J.U., including “dangerous dependency” on their products, sleep deprivation, anxiety, suicidal ideation and an eating disorder.
Plaintiff T.M., on behalf of 17-year-old Florida resident A.M., asserts claims against Instagram, Snapchat and TikTok for “deliberate direction of excessive amounts of depressive, health, and related materials to A.M. based on her age and gender,” plus Snapchat’s gamifications and social metric features that caused her depression and emotional and physical harms.
Plaintiff V.M., on behalf of 17-year-old Texas resident J.C., alleges excessive use of Instagram, Snapchat and TikTok led J.C. to “lose control and act out in uncharacteristic ways” as she became more depressed and anxious after long hours on the platforms.
Plaintiff Krislyn Wells, of Austin, Texas, asserts claims against Meta and Snap on behalf of her son Chandler, who died in 2021 at 18, after suffering mental health harms arising from excessive social media use. Chandler “stopped being able to sleep at night and became anxious and depressed." He also started "obsessing over his appearance and would go through periods of starving himself and binge eating,” the complaint said.
Plaintiff Ivy Cavazos alleges Instagram, Snapchat and TikTok led her 19-year-old daughter, Texas resident T.R., to “self-harm” as a result of a “dangerous dependency on their products.”
Florida plaintiff Destiny Bryant, asserting claims against Facebook, Instagram, Snapchat and TikTok, alleges she lost interest in other activities after isolating herself on social media, where “AI driven” tools targeted her with harmful social comparisons, plus “depressive and suicidal, disordered eating, and other harmful materials.” Destiny's mother, initially unaware of her daughter’s social media use, didn't consent to Destiny opening a Facebook account at 12, said the complaint.
Plaintiffs plead all causes of action in the complaint “in the broadest sense, pursuant to all laws that may apply under choice-of-law principles,” including the law of the plaintiffs’ resident states, said the complaint. To the extent applicable to specific causes of action, plaintiffs plead those causes of action under all applicable product liability acts, statutes, and laws of their respective states, it said. In addition to several negligence claims, the complaint asserts strict liability for design defect and failure to warn, sex and age discrimination, wrongful death and survival action.
Plaintiffs seek past, present and future damages; loss of earnings and impaired earning capacity; medical and funeral expenses; damages available for wrongful death and survival; punitive or exemplary damages; and attorneys’ fees and costs.