The Utah Social Media Regulation Act is “an unconstitutional attempt to regulate both minors’ and adults’ access to -- and ability to engage in -- protected expression,” said NetChoice’s complaint Monday (docket 2:23-cv-00911) in U.S. District Court for Utah in Salt Lake City, which seeks to block the state from enforcing the law when it takes effect March 1.
Section 230
The 5th U.S. Circuit Court of Appeals, by an 8-7 vote, denied plaintiff-appellant John Doe’s motion for rehearing en banc of the panel decision affirming the district court’s dismissal, on Section 230 grounds, of his claims against Snap, said its order Monday (docket 22-20543). Doe was sexually assaulted by his high school teacher when he was 15 years old. The teacher, Bonnie Guess-Mazock, who pleaded guilty to sexual assault, used Snapchat to send him sexually explicit material, and Doe sought to hold Snap accountable for its alleged encouragement of that abuse. Bound by the 5th Circuit’s “atextual interpretation” of Section 230, the U.S. District Court for Southern Texas and a panel of the 5th Circuit “rejected his claims at the motion to dismiss stage,” wrote the seven dissenting judges. The en banc court, by a margin of one, voted against revisiting its “erroneous interpretation” of Section 230, “leaving in place sweeping immunity for social media companies that the text cannot possibly bear,” they said. That “expansive immunity” is the result of adopting the too-common practice of reading extra immunity into statutes where it doesn’t belong, and of relying on policy and purpose arguments to grant sweeping protection to internet platforms, they said. “Declining to reconsider this atextual immunity was a mistake,” they said. Section 230 “closes off one avenue of liability” for social media companies by preventing courts from treating platforms as the “publishers or speakers” of third-party content, they said. “In fact, Section 502 of the Communications Decency Act expressly authorizes distributor liability for knowingly displaying obscene material to minors,” they said. “It strains credulity to imagine that Congress would simultaneously impose distributor liability on platforms in one context, and in the same statute immunize them from that very liability,” they said. “That our interpretation of Section 230 is unmoored from the text is reason enough to reconsider it,” said the dissenting judges. “But it is unmoored also from the background legal principles against which it was enacted,” they said. Doe urged the 5th Circuit “to treat Snap as a distributor and not as a publisher,” they said. Doe states, correctly, that Section 230 was enacted to provide immunity for creators and publishers of information, not distributors, they said. The Communications Decency Act itself authorizes liability for platforms as distributors, they said. But “our overbroad reading of Section 230 renders Doe’s claim dead in the water,” they said.
The court should dismiss a privacy complaint against X, formerly Twitter, because the plaintiff has no viable claim that the social media platform violated the Illinois Biometric Information Privacy Act (BIPA), said X's Nov. 20 reply (docket 1:23-cv-05449) in support of its motion to dismiss. Plaintiff Mark Martell’s August complaint alleged X implemented software to police pornographic and other “not-safe-for-work” images uploaded to Twitter without adequately informing individuals who interacted with the platform “that it collects and/or stores their biometric identifiers in every photograph containing a face that is uploaded to Twitter" (see 2308160021). In his opposition to X’s motion to dismiss, Martell failed to explain how PhotoDNA collects scans of facial geometry when the PhotoDNA webpage “directly refutes that allegation,” said the reply. Martell concedes that PhotoDNA doesn’t enable X to identify individuals depicted in images uploaded to Twitter, but argues that biometric identifiers don’t have to be capable of identifying an individual, a position X called an “oxymoron”: A biometric identifier, by definition, must be capable of identifying an individual, X said. Because BIPA applies only to data that can be used to identify an individual, and because Martell “fails to allege that PhotoDNA enables X to identify anyone,” his claims should be dismissed, it said. The plaintiff also fails to explain why Section 230(c)(2)(A) of the Communications Decency Act doesn’t immunize X from BIPA liability, said the reply. Though Martell said he didn’t seek to post obscene images on X and isn’t suing X for its actions as a publisher or speaker, “both arguments are irrelevant,” said the reply: All that matters under the CDA is whether X’s alleged conduct is an “action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” That is what X does with PhotoDNA: It uses the tool to “identify and remove known images of child exploitation,” the reply said. If PhotoDNA can’t be used to identify individuals depicted in images, “how can X possibly seek those individuals’ consent” under BIPA, it asked. If the court determines that Martell’s understanding of BIPA is correct, and that Section 230 doesn’t immunize X from liability, then X will face an “impossible choice”: “Subject itself to recurring and outsized liability under BIPA by continuing to use PhotoDNA to remove child-exploitation images, or halt that use and allow child-exploitation images to circulate on its platform,” it said.
A California state appeals court ruling in Liapes v. Facebook (docket A164880) would make it unlawful to direct online information and advertisements to the people most likely to be interested in them, said NetChoice in an amicus letter Monday to the California Supreme Court, urging it to review the case.
Google, like almost every email service provider, “uses sophisticated filtering technology to protect users of its free Gmail service from unwanted and dangerous spam emails,” said Google’s memorandum of points and authorities Thursday (docket 2:22-cv-01904) in U.S. District Court for Eastern California in Sacramento in support of its motion to dismiss the Republican National Committee’s Oct. 10 first amended complaint (see 2310120002).
A day after U.S. District Judge Yvonne Gonzalez Rogers for Northern California denied social media defendants’ motion to dismiss a negligence lawsuit against them for their alleged role in fueling a youth mental health crisis in the U.S., Facebook and Instagram parent Meta blogged in favor of federal legislation “to create simple, efficient ways for parents to oversee their teens’ online experiences.”
The Children’s Online Privacy Protection Act (COPPA) claims in California et al. v. Meta, brought in U.S. District Court for Northern California in Oakland last month by 33 attorneys general (see 2310250066), are “self-defeating,” said attorney Cathy Gellis on a Chamber of Progress webinar Thursday examining social media addiction lawsuits pending before state and federal courts.
An Oct. 18 COVID-19 diagnosis forced the office of California Attorney General Rob Bonta (D) to ask the 9th U.S. Circuit Court of Appeals late Wednesday to extend until Dec. 13 the deadline for Bonta’s opening brief in his appeal of the district court's Sept. 18 decision granting NetChoice’s motion for a preliminary injunction to block him from enforcing the state’s Age Appropriate Design Code (see 2310190030), said the office’s unopposed motion (docket 23-2969). Bonta’s lead attorney, Deputy AG Elizabeth Watson, was diagnosed with COVID-19 and “continues to experience symptoms,” said the motion. Watson has returned to work “and will be working to meet and prepare for various deadlines in other matters,” many of which her illness also affected, during the briefing period, it said. The briefing in the appeal “will cover novel legal issues that deserve careful consideration and diligent research, including the appropriate standard of review for laws regulating the collection and use of data,” said the motion. Other attorneys will need to review the briefing before it’s “finalized and filed,” it said. Any “lesser” deadline extension than the 28 days requested “would result in briefing being due on or around the Thanksgiving holiday,” said the motion. Counsel for NetChoice consents to the requested extension with the “understanding” that Bonta’s office won’t oppose a similar extension for NetChoice in the future, it said. Bonta’s office also agreed not to seek a stay of the preliminary injunction order currently in place, said the motion. Under the agreed-on proposed revised schedule, NetChoice’s answering brief would be due Feb. 7, and Bonta’s reply brief would be due March 13, it said. In granting the preliminary injunction, the U.S. District Court for Northern California held that NetChoice was likely to succeed on the merits of its argument that the law violates the First Amendment. The lower court also held that the Children’s Online Privacy Protection Act and the Communications Decency Act's Section 230 preempt it. Bonta filed his notice of appeal Oct. 18 (see 2310190030), the same day Watson's declaration says she was diagnosed with COVID-19.
The opening brief is due Nov. 15 in California Attorney General Rob Bonta’s (D) appeal of the district court's Sept. 18 decision granting NetChoice’s motion for a preliminary injunction to block him from enforcing AB-2273, the state’s Age Appropriate Design Code (see 2310190030), said a 9th U.S. Circuit Court of Appeals clerk’s order Wednesday (docket 23-2969). NetChoice's answering brief is due Dec. 13, and Bonta's optional reply brief is due 21 days after service of the answering brief, said the order. The U.S. District Court for Northern California held that NetChoice was likely to succeed on the merits of its argument that AB-2273 violates the First Amendment. The lower court also held that AB-2273 is preempted by the Children’s Online Privacy Protection Act and Section 230 of the Communications Decency Act.