Policy experts disagreed Monday about whether common carrier regulation should be applied to social media companies, as has been proposed in certain states. “I don’t think common carriage regulation makes sense for internet companies,” said Chamber of Progress Senior Counsel Jess Miers during a State of the Net session. She noted the U.S. Supreme Court in the 1990s recognized the internet as a “wholly new medium of worldwide human communication.” It’s “different from the telephone companies and” traditional common carriers, she said. Traditional common carriers like ISPs provide one avenue to access the internet, so any restrictions can have a significant impact on a user, she said: That’s different from social media platforms because users have many options, even if they get restricted on one service. Fordham University law professor Olivier Sylvain said he wouldn’t elevate social platforms to common carriers like phone companies, which provide “essential” infrastructure. “They’re very different services that are being offered,” said Gus Rossi, Omidyar Network director-responsible technology. But it’s “not completely out of bounds to say, ‘Well, we want to look at carriage of speech in some context.’” Rossi argued the concept could be applied to social media companies, but carefully, which he said states like Florida and Texas have failed to do.
The FCC and DOJ seek a six-day extension, to Feb. 22, to reply to Maurine and Matthew Molaks’ Friday opposition to the commission’s motion to dismiss their petition for review to vacate the agency’s declaratory ruling authorizing E-rate funding for Wi-Fi on school buses (see 2402090048), said the government’s motion Monday (docket 23-60641) at the 5th U.S. Circuit Court of Appeals. The extension is necessary "to allow coordination of the reply between the agencies," it said. Counsel for the Molaks have “represented” that they won’t oppose the extension, it said. The Schools, Health & Libraries Broadband Coalition, an intervenor on behalf of the FCC, supports the extension, it said.
The D.C. government will prioritize values like safety, equity, accountability, transparency and privacy when using AI technology, Mayor Muriel Bowser said Thursday during a Microsoft-hosted event. Bowser signed an order outlining the city’s AI guidelines: “societal benefit, safety and equity, accountability, transparency, sustainability, and privacy & cybersecurity.” Her office highlighted the city government's use of Microsoft 365 and Microsoft’s Azure AI Government Cloud. Microsoft hosted Bowser’s announcement at its Innovation & Policy Center in Washington. Her “commitment to embracing innovative solutions, such as AI, demonstrates her dedication to creating a more responsive and citizen-centric government,” said Fred Humphries, Microsoft corporate vice president-U.S. government affairs.
The FTC won’t postpone its informal hearing on an NPRM on online consumer reviews and testimonials, despite a request from the Interactive Advertising Bureau, the agency said Wednesday. The hearing is scheduled for Tuesday. IAB requested a 30-day extension, also asking the agency to reconsider its conclusion that “there are no disputed issues of material fact necessary to be resolved at the informal hearing." Agencies generally must provide 15 days’ notice in advance of such a hearing, the FTC said, and it published notice of this informal hearing on Jan. 16, 28 days in advance. The agency reached that conclusion in the NPRM, announced in June 2023, finding there are no "disputed issues of material fact that need to be resolved at an informal hearing.” The FTC has been “very clear” it wants to avoid exposing issues with the new rules, which would block “honest opinion” and violate the First Amendment, said IAB Executive Vice President Lartease Tiffith in a statement Thursday. Just as in an informal hearing on the agency’s negative option rule, an administrative judge will decide if the agency has met its burden in this case, said Tiffith: The agency is “underestimating the effects of changes to consumer reviews, and the public should have an opportunity to question its assumptions in front of an administrative judge with decision-making power.”
The Commerce Department should add ByteDance to its foreign trade restriction list to safeguard American data, Reps. Dan Crenshaw, R-Texas, and Josh Gottheimer, D-N.J., wrote Thursday. Commerce should add the TikTok parent company to the Bureau of Industry and Security’s entity list, they said: “This step would be instrumental in applying licensing restrictions to the export of software from the U.S. to ByteDance for its applications. If American users aren't able to upgrade their app with software updates, which involves the export of U.S. software, then the operability of the applications of concern will be weakened.” They noted the department took similar action against Huawei and its non-U.S. affiliates in 2019. The department didn’t comment.
The White House Office of the National Cyber Director is seeking public input on “liability” regimes for holding software companies accountable when they sell technology lacking proper cyber protections, National Cyber Director Harry Coker said Wednesday during ITI’s Intersect event. The Biden administration’s national cybersecurity strategy calls for new liability when software companies “rush insecure code to market,” Coker said. He said his office is working with academic and legal experts to explore “different liability regimes.” The strategy also calls for minimizing compliance burdens on companies, so the office is working with other agencies to harmonize requirements through public feedback (see 2311030046). Coker said that in the coming weeks, his office will release a paper addressing memory safety and software measurability.
Commerce Secretary Gina Raimondo on Wednesday announced leaders of the AI Safety Institute. The National Institute of Standards and Technology established the AISI as directed by President Joe Biden in his AI executive order. Elizabeth Kelly, White House National Economic Council special assistant to the president for economic policy, will serve as AISI's director. Elham Tabassi, a senior research scientist at NIST, will serve as chief technology officer. They “will provide the direction and expertise we need to mitigate the risks that come with the development of this generation-defining technology, so that we can harness its potential,” Raimondo said. Raimondo is scheduled to announce members of the AI Safety Institute Consortium (AISIC) on Thursday. Composed of AI creators, AI users, academics, researchers and civil society organizations, AISIC will support “development and deployment of safe and trustworthy artificial intelligence,” the department said.
The U.K.’s new online speech law is one of the “most fundamental” tools allowing its telecommunications regulator to shed light on how platforms handle misinformation, Jessica Zucker, Office of Communications (Ofcom) director-online safety policy, said Monday. Speaking at a Silicon Flatirons event, Zucker said the U.K.'s Online Safety Act of 2023 is intended to help tackle online malfeasance at scale through systemic improvements across the internet. The law established a duty of care requiring companies to implement policies for removing illegal content, like child sexual abuse material, and legal but “harmful” content. Zucker stressed the law doesn’t require Ofcom to instruct tech companies to remove individual pieces of content or investigate individual complaints.
The Texas Cable Association wants the 5th U.S. Circuit Court of Appeals to hold the FCC’s Nov. 20 digital discrimination order unlawful and to set it aside, said its petition for review (docket 24-60048), filed Tuesday and posted Thursday. It shares the docket number with the petition that the U.S. Chamber of Commerce also filed Tuesday along with the Texas Association of Business and the Longview, Texas, Chamber of Commerce (see 2401300053).
Minnesota should ban dark patterns from social media platform design, Attorney General Keith Ellison (D) recommended in a report released Thursday. Minnesota’s Legislature in 2023 directed Ellison to deliver the report by February. Examples of dark patterns include features that optimize the amount of time users spend on platforms, auto-loading content, and notifications meant to maximize engagement, Ellison said. He highlighted the impact on teens and adolescents: “I will continue to use all the tools at my disposal to prevent ruthless corporations from preying on our children.”