Section 230

26 States, Sen. Hawley, Israeli Generals Weigh In on SCOTUS Google Case

Courts have interpreted the protections of Section 230 of the Communications Act too broadly and social media companies should be held responsible for the content recommended to users by their algorithms, said several amicus briefs filed at the U.S. Supreme Court in Gonzalez v. Google (docket 21-1333) Tuesday by advocacy groups, Sen. Josh Hawley, R-Mo., 26 states and a group of Israeli generals. “Far from discouraging terrorists, social media platforms actively assist their spread,” said the joint filing from former Israeli Minister of Defense Moshe Ya’alon and other retired Israeli military officers. “The same technology that connects dog lovers with chew-toy suppliers on social media platforms carries a parasitic byproduct that is deadly anti-social.”

The courts’ broad interpretation of Section 230 “resulted in the widespread preemption of state laws and the concomitant erosion of traditional state authority,” said a joint filing from 26 states -- including Virginia, Tennessee, Colorado, California, New York and South Dakota -- and the District of Columbia supporting the petitioners. SCOTUS should interpret Section 230 narrowly on the principle of federalism, which “typically allows only the clearest textual commands to alter the balance of state and federal power,” said the states’ filing. Publisher immunities shouldn’t apply to social media companies that “actively engage and interact with their users through tailored content recommendations and other sophisticated programming,” the states said. “A straightforward application of the federalism canon weighs heavily in favor of reversal.”

"Through the years, YouTube has invested in technology, teams, and policies to identify and remove extremist content. We regularly work with law enforcement, other platforms, and civil society to share intelligence and best practices," emailed a Google spokesperson. "Undercutting Section 230 would make it harder, not easier, to combat harmful content -- making the internet less safe and less helpful for all of us."

“Companies like Google are not just publishing material from users, they are exploiting it to make a profit,” said California Attorney General Rob Bonta (D) Wednesday. Bonta was part of the joint state amicus. “I urge the Supreme Court to adopt a more reasonable view of ‘publisher immunity’ under the Communications Decency Act that is in line with Congress’s intent.”

SCOTUS should distinguish publisher liability protected by Section 230 from distributor liability, said Hawley’s amicus brief in support of the Gonzalez family. The case stems from the 2015 murder of Nohemi Gonzalez during the ISIS-planned terrorist attacks in Paris. Gonzalez's family argued Google is liable for the attacks because it owns YouTube, which hosted videos used by ISIS for recruitment. Publisher liability “attaches to the making and publication of statements themselves,” while distributor liability “attaches to the dissemination of statements where a distributor knew or should have known that the statements were unlawful,” said Hawley's filing. The current interpretation of Section 230 “collapses” the distinction between distributor and publisher liability, and distributor liability should apply here, he said. “Google -- with the knowledge that ISIS was using its platform for recruitment -- continued to operate the algorithms that spread unlawful content.”

By resisting regulation and content moderation, social media companies chill speech, said the Giffords Law Center to Prevent Gun Violence in its amicus filing in support of neither party. Social media users censor themselves when posting online to avoid becoming targets of hate speech and harassment, said the filing. “Rather than preserving a free marketplace of ideas, social media companies have effectively put the thumb on the scale of those who shout the loudest,” the Giffords Center said. “This result runs contrary to Congress’s express purpose in enacting Section 230.” The courts should consider the technical specifics of recommendation systems in making their decision, said Princeton University’s Center for Information Technology Policy in another brief supporting neither party. “The technical features of content moderation and recommendation systems have an important role in determining what content is made available to users,” said the CITP.

The retired Israeli generals and the child advocacy groups Common Sense Media and Fairplay argued in briefs supporting the petitioners that the recommendation engines of social media platforms psychologically manipulate users and push them toward extreme content. Social media companies use algorithms “to addict vulnerable youth to their platforms” and “construct and keep children in dangerous online environments,” said Fairplay. “Congress could not have envisioned that section 230 would extend to Google’s activities that steer vulnerable adolescents toward harmful content,” said Common Sense Media. Google’s recommendations aren’t covered under Section 230’s protections for publishers of content authored by others because they're “statements in which Google itself predicts that its users would like the identified video,” said nonprofit Free Speech for People.

The algorithms recommend “increasingly inflammatory content in an echo chamber, and nurture radicalization, terror, violence, and death,” said the former Israeli military officials, adding that groups like ISIS use social media to coordinate operations and secure funding. “The owners of the algorithms are in the best position to aid the fight against terrorism and should not be permitted to abet the very terrorists the rest of the world is attempting to defeat.”