DOJ Weighing Platform Incentives for Section 230, Barr Says
DOJ is considering whether Section 230 creates the right incentives for platforms to maintain a safe internet, said Attorney General William Barr Wednesday. Tech companies are no longer underdog upstarts but titans of U.S. industry, he said during a workshop on Section 230 of the Communications Decency Act: “Given this changing technological landscape, valid questions have been raised on whether Section 230’s broad immunity is still necessary, at least in its current form.”
DOJ has concerns platforms can use Section 230 to thwart lawful police access to information, even when warrants are involved, he said. Victims also might be blocked from seeking civil recovery, he said. The threat of civil liability “can create industrywide pressure and incentives to promote safer environments,” he said. Senate Judiciary Committee Chairman Lindsey Graham, R-S.C., and Sen. Richard Blumenthal, D-Conn., are working on a draft legislative proposal that would hold companies civilly liable for content involving child exploitation (see 2002070052).
Courts have expanded interpretations of Section 230, “seemingly stretching beyond the statute’s text and original purpose,” Barr said: Companies have invoked immunity “even where they solicited or encouraged unlawful conduct, shared in illegal proceeds, or helped perpetrators hide from law enforcement.” Police can’t delegate their responsibility for protecting the public to for-profit private firms, he said.
Returning the internet to the days before Section 230 would result in two extremes, the Internet Association commented Tuesday: Companies would either moderate nothing, for fear of taking on publisher liability, or over-moderate, leading to censorship. “The flourishing middle ground we enjoy today would cease to exist” without Section 230, IA said.
Panelists at DOJ’s workshop had wide-ranging views. Authors, not bookstores, are liable for their content, said Charles Koch Institute Senior Research Fellow-Technology and Innovation Neil Chilson. Similarly, internet users, not platforms, should be responsible for the content they post, he continued.
A platform like Facebook isn’t responsible for a murder shown in content that surfaces on its service, but it should be held responsible if it profits from that content, or is “selling tickets to the murder,” countered University of Miami law professor Mary Anne Franks. Amplifying such content results in serious harm, including murder, she said.
Section 230 is likely to be altered further, said U.S. Naval Academy assistant professor of cybersecurity law Jeff Kosseff. There’s no sense in complaining about that; the more useful exercise is evaluating specific alterations to the law, he said. Large companies like Facebook and Google will be able to influence the changes, he said.
Without Section 230, the default rules for content moderation would come from the First Amendment, which would free platforms of liability for objectionable content so long as they avoid moderating it, said WilmerHale partner Patrick Carome. That would give platforms “a strong incentive to bury their heads in the sand,” he said.
Platforms don’t have incentive to act right now, said attorney Carrie Goldberg. She cited a case in which, she said, the dating app Grindr ignored her client’s repeated reports of harassment by an ex-boyfriend. Section 230 is treated as a pass to take no action whatsoever, she said. That the ex-boyfriend wasn’t held criminally liable is a failing of the criminal justice system, said Carome. Section 230 isn’t the proper route for holding the harasser accountable, he said.
The solution is simple, said Nebraska Attorney General Doug Peterson (R): Allow states to enforce their own laws to hold platforms accountable for online crimes, which are rapidly accelerating. Industry believes states should instead enforce a federal standard, said Computer & Communications Industry Association President Matt Schruers.
Section 230 has flaws, but revamping a statute that has worked well for the past 25 years would be unwise, Carome said. Section 230 keeps the focus on the wrongdoers, not the tools they use, he said.