SCOTUS Review Looms

Section 230's Broad Liability Shield Seen as Living on Borrowed Time

The Supreme Court will almost certainly recast or cut back the broad immunity that interactive online platforms enjoy under the Communications Decency Act's Section 230 liability shield, but the dearth of high court precedent on Section 230 makes it unclear how the justices will change that legal status quo, legal experts told us. The court granted cert last week in two related cases on social media platforms' legal protection when their services are used in conjunction with terror attacks (see 2210030036).

"The court almost certainly wouldn't have taken the case unless they were prepared to make some changes," said Mark MacCarthy, Brookings Institution governance studies nonresident senior fellow-Center for Technology Innovation. That there was no circuit court split points to it being "purely an act of will for them to take up the case," he said.

The Supreme Court could decide whether Section 230 has been interpreted too broadly to allow editorial discretion through algorithmic recommendations, said Competitive Enterprise Institute Center for Technology & Innovation Director Jessica Melugin. “There isn’t a lot of jurisprudence on this," she said, but the Supreme Court has "the freedom to curtail Section 230’s interpretation" and say algorithmic recommendations don't warrant liability protection.

But as justices look into this issue and see how central algorithmic recommendations are to social media business models, it will be more difficult to offer a “practical ruling” limiting platforms’ ability to recommend, Melugin said. Such a ruling “would drastically transform people’s experience with social media,” she said.

SCOTUS justices opting to take up the case either "want to make a mark on the wall" or saw circuit courts leaving platforms too broad a shield, said intellectual property lawyer Erik Olson of Farella Braun. The Gonzalez petition details platforms' recommendation-centric revenue model -- keeping users online longer via algorithms that offer content they seemingly would like, based on their past content consumption habits.

“The general vibe is that if you’re happy with the status quo on Section 230, it’s probably not a good thing that the Supreme Court took these cases,” said Charles Koch Institute Senior Research Fellow Neil Chilson, former FTC acting chief technologist. “But there’s a lot of complicated nuance in these cases.” The Supreme Court has never directly interpreted the text of the statute, so it's a good opportunity, but these aren’t the ideal cases for a conservative seeking to alter how platforms moderate, he said.

The lawsuits against social media moderation laws in Florida and Texas (see 2209290047) are a better fit for shaping social media content moderation, he said. The Supreme Court cases are great "if you’re a plaintiffs’ lawyer who wants to sue Big Tech companies with sympathetic facts,” he said, noting the “terrible tragedies” in both cases.

Section 230 'May Be Sort of a Bonus'

Chilson noted that the Anti-Terrorism Act (ATA) is the focal point in both cases but said the cases will provide justices with lots of opportunity to discuss Section 230. However, the court could ultimately decide the cases without interpreting anything about Section 230 and instead focus on the ATA: “The court took both these cases because they want to deal with the application of ATA, and Section 230 may be sort of a bonus.”

A SCOTUS decision that recommendations fall outside the liability shield would mean platforms eliminating recommendations or changing their algorithms and adding human reviewers, Olson said. The former seems unlikely, and the latter would increase platforms' costs without eliminating their risk, he said. Section 230's power is in providing immunity that can be invoked early, such as in a motion to dismiss, he said. If recommendations in some cases fall outside that immunity, litigation expenses will grow notably, he said.

"This wasn't the vehicle I expected to take Section 230 to the Supreme Court," said Will Duffield, Cato Institute Center for Representative Government policy analyst, since algorithmic liability seemed to be a settlement issue with the 2nd U.S. Circuit Court of Appeals' Facebook decision and there's no circuit split there. There's a lot of precedent at the Circuit court level, but SCOTUS doesn't necessarily have to follow that, he said.

Some cases have tried to circumvent Section 230 defenses by suing over the recommendation algorithm as a type of defective product, though the incidents that gave the plaintiff standing were often in effect issues of hosting particular speech rather than of the algorithm, Duffield said. Suits under the ATA have argued platforms should be held responsible for radicalization, but the counterargument has been that algorithms rely on past views and likes and are themselves neutral, he said. Any recommendation of radicalizing content is the result of garbage in, garbage out rather than steering, he said. If platforms were suddenly liable for downstream effects, decisions not to recommend content could discriminate against more controversial viewpoints or content, he said.

The concern is that existing law has developed in a way that assures people the broadest ability to find a platform to get their views out, and any curtailing or narrowing of Section 230 protection could affect what kind of content gets hosted and the wide swath of services available, said Aaron Mackey, Electronic Frontier Foundation senior staff attorney. The petitioner is asking the court to draw a clear line between traditional editorial functions, like deleting some user-generated content, and recommendations, but "there is a whole bunch of stuff in between," like prioritizing some content over other content, that makes drawing a clear line difficult, he said.

It’s “worrisome” because a misinterpretation of the statute could have “grave consequences” for the tech sector and all digitally enabled portions of the economy, which is essentially the entire economy, said Computer & Communications Industry Association President Matt Schruers. All online content moderation relies on algorithmic filtering, he said: “One cannot parse out human editorial decisions from algorithmically enabled editorial decisions and think that you now have two groups. It’s one universe.”

Given its importance, Section 230 was inevitably going to reach the high court, said Schruers: “You could argue that the fact that it took 25 years is sort of a surprise.” What’s also surprising is that these cases aren’t an “ideal” vehicle for the Supreme Court to take up Section 230, given that they largely relate to anti-terrorism law, he said.

Hill Activity Seen as Unlikely

Multiple case watchers told us they consider congressional action on revising Section 230 less likely while Gonzalez is pending, because lawmakers will probably wait to see whether SCOTUS solves the problem for them. But Bipartisan Policy Center Technology Project Director Tom Romanoff said if the GOP retakes the House this fall, going after platform content moderation policies and Section 230 will likely be a high priority for the incoming Congress.

Expect "massive" numbers of amicus briefs, and those will start to sketch out the potential scope of effects, depending on how the court rules, said intellectual property lawyer Lee Gesmer of Gesmer Updegrove. He said it would be surprising if DOJ didn't weigh in, potentially not opposing Gonzalez but proposing a narrow test for SCOTUS so even if the justices side with Gonzalez they do so on narrow grounds. DOJ didn't comment.

Tech companies and tech trade groups will weigh in defending the Section 230 status quo, but it's less clear what some progressive groups will do, Brookings' MacCarthy said. Some have been full-throated supporters of Section 230, but some civil rights groups concerned about ugly behavior on platforms are increasingly restless about the idea that those platforms have no legal responsibility to do anything in response, he said.

There's also a split on the right about Section 230, with libertarian and business-friendly groups that favor less responsibility for business disagreeing with conservative groups that have a more cultural focus, MacCarthy said.

One reason SCOTUS' likely direction on Gonzalez is so opaque is that it's not clear how much culture war dynamics will factor into the decision, said Eric Goldman, co-director of Santa Clara University's High Tech Law Institute. If the decision breaks down 6-3 along conservative/liberal lines, "the internet is screwed," said Goldman, who supports maintaining the Section 230 liability status quo.

The court could rule that algorithmic recommendations are excluded from Section 230 protection, but it also could go further and say recommendations of any type aren't covered, Goldman said. That would leave the liability shield applying only to hosting services like Google Drive and Dropbox, he said.

Even if the court decides Section 230 covers recommendation algorithms, Goldman said, “if they free-associate in their writing, which they usually do, it creates the possibility the plaintiffs are going to pounce on the ambiguities and still lead to a net loss." The 9th U.S. Circuit Court of Appeals' Gonzalez decision "was a mess," and the justices might have granted it cert because it was so bad, with opinions structurally different from one another, he said.

If Republicans retake either chamber of Congress, there will be a slew of anti-Section 230 bills from that chamber, Goldman said. A GOP-controlled House would attack Section 230 as a primary agenda item regardless of the pending case “because it's all about messaging for their voter base [and] ‘sticking it to big tech,'" he said.