
Section 230 Gives Platforms ‘Too Much Legal Immunity’: NY AG Report

Tech companies “are largely free to create and operate online platforms without legal consequences for the negative outcomes of their products” because of Communications Decency Act Section 230, said an investigative report Tuesday from the Office of the New York Attorney General on the role of online platforms in the May 14 mass shooting in Buffalo that killed 10 and wounded three. Section 230 allows “too much legal immunity” for platforms, even “when a platform allows users to post and share unlawful content,” it said.

The framers of Section 230 “wanted to encourage content moderation by providing protections to online platforms acting as ‘Good Samaritans’ in their attempts to monitor and remove offensive content on their platforms,” said the report. But the internet “has changed dramatically” since the statute was enacted in 1996, it said. In that time, courts have found that Section 230, which protects companies from liability for the under-removal of user content, “should be construed broadly in favor of immunity,” it said.

Though there are some “discrete areas” where Section 230 protections don’t apply, “by and large the Section 230 case law has made it difficult to hold companies responsible for their failure to take even basic steps to monitor and ensure the safety of their platforms,” said the report. “Therefore, we are calling on Congress to reform Section 230.”

While “broader reform” of Section 230 may be necessary, “our proposal focuses on the violent and extremist content at issue here,” said the report. “We recommend that Congress rethink the ready availability of Section 230 as a complete defense for online platforms’ content moderation practices.” It suggested a fix that requires an online platform seeking to retain Section 230 protections “to take reasonable steps to prevent unlawful violent criminal content from appearing on the platform.”

The proposal significantly “changes the default,” said the report. “Instead of simply being able to assert protection under Section 230, a defendant company has the initial burden of establishing that its policies and practices were reasonably designed to address unlawful content.” The suggested reform “would incentivize companies to establish robust content moderation programs, but would not penalize those companies if violative content slips through despite such programs,” it said.

“Reasonableness” must account for the prevailing technology, “including whether companies are making investments in and deploying the same level of sophisticated technology to content moderation as they are for other parts of their business,” said the report. “This would help establish a baseline of content moderation across online platforms, helping to ensure a safer experience for all users.”

The report recommends that Congress authorize the FTC to draft regulations “providing guidance on the reasonable steps that an online platform must take to obtain Section 230 protection,” or create a new agency dedicated to regulating online platforms. “At a minimum, reasonable steps should include efforts to remove unlawful violent criminal content and content likely to incite or solicit violent crime,” it said.

Also on the table for Section 230 revisions should be measures “to prevent the platform from being used to encourage or plan acts of violence,” it said. It also recommended limits on livestreaming technology “designed to prevent its use to further criminal acts and incite or solicit violent crimes.”