Meta, Snap, TikTok Could Face More Lawsuits Over Child Suicides, Harm
Meta, Snap and TikTok could face additional lawsuits from attorneys who sued the companies earlier this year over the suicide of an 11-year-old Connecticut girl and her social media addiction.
The platforms are “unreasonably dangerous” because they’re designed to hook children and subject them to predators and sexual exploitation, Social Media Victims Law Center founder and attorney Matthew Bergman told us. He filed three similar cases and is preparing additional complaints that deal with self-harm and suicide.
The Selena Rodriguez lawsuit is a “watershed” case, said Bergman, who expects a decision from the U.S. District Court in San Francisco in docket 3:22-cv-00401 in August. According to the complaint, Rodriguez was on social media at “all hours of the day and night” communicating with some 2,500 users, including conversations with adults who “bullied” the 11-year-old into trading sexually explicit images. She posted a six-second suicide video in July 2021 on Snapchat. The video showed her ingesting two of her mother’s Wellbutrin antidepressant pills. According to the complaint, she took the pills, “took a gulp of soda out of a bottle, looked into the camera, made the ‘Peace Out’ hand gesture and playfully stuck out her tongue.”
Bergman and Selena’s mother, Tammy Rodriguez, are pursuing common law negligence and product liability claims against all three platforms. They believe the platforms are liable for allowing predators to communicate with Rodriguez in violation of the 2018 Stop Enabling Sex Traffickers Act-Allow States and Victims to Fight Online Sex Trafficking Act (SESTA-FOSTA) package (see 2202240065).
A Meta spokesperson highlighted mental health resources for parents and children on the company’s platforms. That includes parental controls on Instagram, like time limits for usage. The company cited tools for preventing suicide and self-harm, including crisis hotlines and decreased visibility of objectionable content. The platforms also have controls for removing "like" counts on posts, Meta said. The company declined to comment about ongoing litigation. Snap and TikTok didn’t comment.
Bergman said he started the firm in November as an “outgrowth” of the testimony Facebook whistleblower Frances Haugen delivered to the Senate Consumer Protection Subcommittee (see 2110050062). Subcommittee Chairman Richard Blumenthal, D-Conn., and ranking member Marsha Blackburn, R-Tenn., introduced the Kids Online Safety Act months after the hearing. The legislation is intended to empower children and their parents, Blackburn said. President Joe Biden unveiled a sweeping agenda to address the social media-linked children's mental health “crisis” during his State of the Union address in March (see 2203010072). Surgeon General Vivek Murthy issued an advisory on youth mental health in December.
Court documents show Rodriguez started using social media apps around age 9, and her activity increased to the point that she was getting only about two hours of sleep a night. When her mother confiscated her devices, Selena would self-harm or run away from home, the complaint said.
Rodriguez spent most of her time on Instagram, Snapchat and TikTok. Multiple absences from school led the Connecticut Department of Children and Families to investigate, and the department found her mother to be a “responsible and responsive parent,” said court documents. They show several exchanges between Selena and adult male users on various platforms, in which the adults asked for and received sexually explicit photos when Selena was as young as 10. She received targeted advertising on Instagram, in which a user said their company was looking for “ladies to be models and brand ambassadors for our brand” of underwear.
Another conversation showed one user offering Selena $600 a week to be his “sugar baby.” Selena said in the conversation that she’s 12, and “you’re like” 30. “You can get arrested,” she wrote. Selena sent sexually explicit images on Snapchat, which were “subsequently shared or leaked to her classmates.” She chatted with an 11-year-old male user in Philadelphia, telling him about her mental health struggles and hospitalizations. “Sorry just got home from Connecticut children’s hospital for trying to kms,” she wrote a month before the suicide. Her “death was the proximate result of psychic injury caused by her addictive use of Instagram, TikTok, and Snapchat social media products,” the complaint said.
Algorithms “affirmatively direct” children to dangerous and unwanted content resulting in “severe psychological harm and even death,” said Bergman. He has been in touch with “many, many” other families, some of the cases dealing with suicides. “You will see that the factual allegations in Rodriguez are horrific as they relate to what a fifth grade girl was subjected to online,” he said. “I anticipate this is going to be a robust litigation.”