DOJ Urges SCOTUS to Hear Challenges to Texas, Fla. Social Media Laws
The U.S. Supreme Court should grant the cert petitions of NetChoice and the Computer & Communications Industry Association challenging, on First Amendment grounds, the content moderation restrictions in the Florida and Texas social media laws, said Solicitor General Elizabeth Prelogar in an amicus brief Monday (dockets 22-277 and 22-555).
Prelogar urged the court to deny NetChoice’s cross-petition (docket 22-393) challenging the laws’ general-disclosure and viewpoint-discrimination provisions because those challenges don’t warrant review and NetChoice is unlikely to succeed on the merits. SCOTUS invited the solicitor general’s brief in January (see 230123005). Court observers we polled then said SCOTUS wouldn’t have sought DOJ’s input unless it planned to grant cert.
The court should grant cert to decide whether the content-moderation and individualized-explanation requirements of the Florida and Texas statutes comply with the First Amendment, said the brief. The questions raised in the petitions are “undeniably important,” and all parties agree they warrant SCOTUS review, it said.
The platforms’ content-moderation activities are protected by the First Amendment, and the content-moderation and individualized-explanation requirements in the Florida and Texas laws “impermissibly burden those protected activities,” said the brief. The decisions in the 5th and 11th Circuits “create a square and acknowledged circuit split” on the important First Amendment question about states’ authority “to restrict a business’s ability to select, edit, and arrange the third-party content that appears on its social-media platform,” it said.
Even before that conflict emerged, SCOTUS recognized that the question presented “would likely warrant review” by vacating the 5th Circuit’s stay of the preliminary injunction in the Texas case, said the brief. In DOJ’s view, the court should grant review in both the Florida and Texas cases.
Though the cases turn “on the same fundamental question” about the First Amendment status of the platforms’ content-moderation activities, the statutes “target different types of content moderation and impose different obligations,” said the brief. Those differences “ultimately may not be material” to the court’s First Amendment analysis, but considering the two laws together would give SCOTUS “the fullest opportunity to address the relevant issues,” it said.
When a social-media platform “selects, edits, and arranges third-party speech for presentation to the public, it engages in activity protected by the First Amendment,” said the brief. “That activity, and the platforms’ business practices more generally, are not immune from regulation,” it said. But here, the states haven’t “articulated interests that justify the burdens imposed by the content-moderation restrictions under any potentially applicable form of First Amendment scrutiny,” it said.
Like publishers, parade organizers and cable operators, the companies that run the major social-media platforms are in the business of delivering curated compilations of speech created by others, said the brief. “Given the torrent of content created on the platforms, one of their central functions is to make choices about which content will be displayed to which users, in which form and which order,” it said. The act of “culling and curating” the content that users see is “inherently expressive,” even if the speech that’s collected “is almost wholly provided by users,” it said.
Especially because the platforms’ only products are “displays of expressive content,” a government requirement that they display different content “plainly implicates the First Amendment,” said the brief. In arguing otherwise, the states “asserted that the platforms’ content-moderation activities are unprotected conduct,” it said.
The states also emphasized that while the major platforms prioritize and arrange all the content that appears on their sites, they don’t “moderate most of it,” said the brief. Nor do the platforms, according to the states, “endorse the messages expressed by users,” and they’re shielded from liability for third-party content under Section 230, it said.
But what makes the platforms’ content-moderation choices expressive isn’t that the platforms “adopt as their own or assume legal responsibility for each individual piece of content posted by users,” said the brief. It’s that they choose “whether and how to present that content by selecting, curating, and arranging it,” it said.
NetChoice and CCIA hailed the solicitor general’s brief in separate statements Monday. The brief “underscores” that the Florida and Texas statutes are unconstitutional and that SCOTUS “should review our cases,” said NetChoice Director-Litigation Chris Marchese. NetChoice urges the court to “strike down” the laws “and reaffirm that the Constitution prohibits the government from controlling online speech,” said Marchese.
CCIA “is glad to see” the solicitor general “confirm the importance of the First Amendment issues raised by these state laws,” President Matt Schruers said. “This is exactly the sort of case we would expect the Supreme Court to take up,” because it involves a key constitutional issue and split appellate court decisions, he said.