Bill Critiqued

Facebook, Google, Twitter CEOs Offer Mixed Takes on CDA S. 230

Communications Decency Act Section 230 “would benefit from thoughtful changes,” Facebook CEO Mark Zuckerberg plans to tell House Commerce Committee members during Thursday’s virtual hearing (see 2103190054). Google CEO Sundar Pichai defends the statute in prepared testimony, saying recent proposals could have unintended consequences. Twitter CEO Jack Dorsey instead focuses on transparency, procedural fairness, algorithmic choice and privacy.

"Identifying a way forward" to make 230 "work better for people" is a challenge amid "the chorus of people arguing -- sometimes for contradictory reasons -- that the law is doing more harm than good,” says Zuckerberg. He committed to being a “productive partner” in these discussions.

Recent proposals, including calls to repeal the provision outright, wouldn't serve the open internet well, says Pichai: Many would harm “both free expression and the ability of platforms to take responsible action to protect users in the face of constantly evolving challenges.” He notes that regulation has an important role in “addressing harm and improving accountability.”

Dorsey echoes remarks his company made previously, citing four principles: better transparency; procedural fairness, so content moderation mistakes can be corrected; algorithmic choice, to empower users; and strengthened user privacy. “I agree with this Committee that technology companies have work to do to earn trust from those who use our services,” he says.

“Rolling back” Section 230 will cement Facebook as the “dominant social media company and make it vastly harder for new startups to challenge [Zuckerberg's] cash cow,” said Sen. Ron Wyden, D-Ore., Wednesday. “Everyone working to address real issues online should be deeply wary about” his proposals.

Experts Respond

Observers questioned the wisdom of rolling back 230 protections and giving consumers a private right of action to sue, as House Consumer Protection Subcommittee Chair Jan Schakowsky, D-Ill., said she intends to do with new legislation (see 2103220060). Her Consumer Protection Act would clarify the statute and require platforms to establish and disclose terms of service covering how they handle misinformation and other content moderation issues, she said earlier this week.

It would likely be more effective to encourage the FTC to better police terms of service through its unfair and deceptive practices authority than to expose companies to consumer lawsuits, said New York University Stern Center for Business and Human Rights Deputy Director Paul Barrett, based on what Schakowsky announced. He agreed that companies need to be held more accountable for keeping the promises in their terms of service, but a private right of action might lead to “more chaos” than systematic enforcement.

Turning each content moderation decision, “of which services may make billions in a day, into the basis for potential litigation seems like an efficient way to eliminate the entire genre of user-generated content,” said Santa Clara University law professor Eric Goldman. Schakowsky “may think that the legal exposure will improve sites' decision-making, but in fact it creates a no-win game where the only smart response is not to play.”

“The harm really stems from the fact that they will have to go to court and defend every decision a user has a problem with,” said Jeffrey Westling, R Street Institute resident fellow-technology and innovation. “The existing intermediary liability regime basically says that if a platform engages in no moderation, they don't have the requisite knowledge for liability.” That means risk-averse platforms will either over-moderate or not moderate at all, making the issue of misinformation worse, he said.

This bill is likely to be one of many proposed this year to hold platforms “responsible” for content and carve away immunity, said University of Florida Brechner First Amendment Project Director Clay Calvert. “Certainly some legislation -- likely watered down from its original form -- will make it through both the House and the Senate in 2021.”

Groups Respond

Industry and other groups issued statements before the hearing; others commented in interviews.

Consumer groups want members to question the CEOs about issues affecting minorities, particularly Facebook’s moderation of Spanish-language content. The platform removes violating English-language content but allows comparable Spanish-language content to go “untouched,” said Free Press Senior Policy Counsel Carmen Scurato: The disparity in enforcement is of “grave concern” and needs to be addressed if Facebook wants to be a global company. She noted that her organization recently held a press call with Rep. Tony Cardenas, D-Calif., whom she expects to raise the issue during the hearing. His office didn’t comment. Section 230 needs to be clarified so it doesn’t absolve companies of responsibility for harm they cause through their own actions, she said.

The Center for Democracy & Technology sent possible questions for members about disinformation campaigns targeting people based on race and gender, said a CDT spokesperson. CDT wants information on what actions companies are taking to ensure platform design doesn’t “amplify disinformation based on racist or misogynistic content.”

The Multicultural Media, Telecom and Internet Council doesn’t support the repeal of Section 230 because it would be detrimental to small businesses and minority voices, said CEO Maurita Coley. But MMTC supports guardrails for continued immunity, she said: Platforms should not be immune when letting users “create and spread discriminatory content like hate speech,” but they should be immune when working to “prevent users from creating and spreading discriminatory content like hate speech.”

If 230 were weakened or eliminated, “it would be vastly more difficult to control and remove misinformation online,” said CTA CEO Gary Shapiro. The statute enables platforms to “remove disinformation and extremist speech without being hauled into court for every moderation decision,” he said.

Rather than intervene and make things worse, Congress should allow industry to develop moderation policies that suit users, NetChoice said in a comment to the committee. Congress should instead use its own platform to shine a spotlight on disinformation wherever it appears, the organization said: “By suggesting the problem is confined mainly to websites, Congress risks lulling Americans into falsely believing that what they see is what they should believe -- so long as it’s not online.”

“Weakening laws to allow more lawsuits from users would unfortunately have a chilling effect on the newer and smaller social media companies, and result in fewer options for posting online content,” said Computer & Communications Industry Association President Matt Schruers.