Walden Says Tech’s Content Liability Protection Needs Tightening
The tech industry’s content liability shield should be tightened because its protections have expanded well beyond the original intent, House Commerce Committee ranking member Greg Walden, R-Ore., told us Thursday. Congressional scrutiny of Section 230 of the Communications Decency Act is growing.
Sen. Mazie Hirono, D-Hawaii, recently requested a briefing with Google for its handling of harmful, violent content. Rep. Anthony Brindisi, D-N.Y., asked the FTC to investigate Instagram’s handling of similar, murder-related content. We received the FTC's correspondence with Brindisi through a Freedom of Information Act request.
Section 230 liability protections “have been greatly expanded beyond the authors’ original intent,” Walden told us. “Broad, sweeping exemption from any liability” was never the intent, he said. “There’s going to be a time to review 230 and to straighten it out.”
Videos showing the 2015 on-air assassination of a 24-year-old reporter continue to surface on YouTube, despite complaints from her father, Andy Parker (see 1908060064), said a Jan. 9 letter Hirono wrote to Google. She sought the briefing after Google failed to address some 70 videos Parker found on YouTube. Google gave the impression during a July hearing that it had removed all such videos from the platform, but Parker later identified at least 67, she wrote. The company didn’t comment.
Parker told us he’s preparing an FTC complaint with Georgetown University Law Center contending Google has made material misrepresentations about its services. The platform has been unfair and deceptive, he said.
Brindisi’s Aug. 7 letter cited the death of New York teenager Bianca Devins. Photos of her murder were shared widely on social media, including Instagram. The legislator asked the agency to use FTC Act Section 6(b) authority to investigate whether large platforms, including Instagram, are breaking the law.
If the agency compelled information about platform content decisions, it could be challenged under the First Amendment, Chairman Joe Simons wrote Brindisi in a Nov. 13 letter. Free speech protections extend to content that may be “graphic, offensive, and violent in nature, notwithstanding the mental anguish such content may cause,” he wrote.
Simons cited Section 230, which limits platform liability for user-posted content: platforms “generally cannot be held liable for, or be required to remove, their users’ posts.” Businesses can be held liable for their own “material misrepresentations” about services, however, he wrote, which is the heart of Parker’s forthcoming complaint. The agency’s authority allows it to police such unfair and deceptive practices, Simons noted.
Simons’ letter suggests Section 6(b) authority doesn’t necessarily apply to Brindisi’s request. Section 6(b) allows the agency to study commercial practices without a specific law enforcement purpose, he wrote. “I share your concern about the infliction of further pain on families of crime victims, such as Ms. Devins’ family, when distressing images of their loved ones proliferate on social media, despite their attempts to have the images removed.”
Hirono noted Parker repeatedly asked Google to remove content about his daughter’s murder. He claims the platform told him he must flag the content for it to be taken down.
Congress needs to take a comprehensive view of Section 230, Rep. Bob Latta, R-Ohio, told us. He’s aware House Consumer Protection Subcommittee Chair Jan Schakowsky, D-Ill., is exploring related legislation dealing with election misinformation (see 2001280059). “If you’re going to be looking at it, it should be more of a broader view,” not just pinpoints, he said.