‘Extraordinary’ Power

Blumenthal Wants AI Commission; Hawley Focuses on Google, Microsoft

Congress needs to establish a national commission to license and audit AI companies, Senate Privacy Subcommittee Chairman Richard Blumenthal, D-Conn., said Tuesday.

The subcommittee continued its exploration of AI legislation at Tuesday’s hearing. Blumenthal introduced the No Section 230 Immunity for AI Act in June with ranking member Josh Hawley, R-Mo. Hawley told us to expect other pieces of legislation from subcommittee leadership (see 2307170054). Blumenthal said he has concluded a national commission is needed to develop countermeasures against dangerous automated decision-making. Sen. Amy Klobuchar, D-Minn., said she’s also engaging with Hawley on legislation. Blumenthal and Klobuchar said Congress can’t repeat the mistakes it made in allowing the tech industry to avoid liability under Communications Decency Act Section 230.

Hawley raised concerns about Google, Meta and Microsoft further entrenching their dominance through generative AI products. He noted Google reportedly invested $300 million in Anthropic and has a 10% stake in the company. Microsoft’s reported stake in OpenAI is 49%, he said. He asked Anthropic CEO Dario Amodei whether there are plans to integrate the company’s AI text generator with Google Search. Amodei said there currently aren’t any plans to do so, and the relationship between the two companies is focused on hardware. The power Google would gain from having an AI text generator to push content to users would be “extraordinary,” said Hawley, who argued the interplay between the two companies is only likely to increase privacy abuses.

Amodei testified that Congress should require watermark labels on AI-produced campaign content. He recently attended a White House event with other tech executives, where the companies agreed to voluntary measures to ensure AI develops safely and securely (see 2307210043). University of Montreal computer science professor Yoshua Bengio agreed on the need for watermarks. Bengio also said social media accounts should be restricted to human users who have verified themselves. Companies might push back on such a proposal because it could make it harder to accumulate users and content, he said.

EEOC Commissioner Keith Sonderling, speaking during a Federalist Society event, said his agency is addressing the employment issues AI raises. AI can discriminate against job applicants on a scale far wider than any individual HR professional, he said. He warned companies they’re going to be liable for their systems’ discrimination, whether or not they intended to discriminate. It’s “crucial” that companies keep this in mind as they rely more heavily on AI technology, he said. If the training data skews heavily toward one race, sex, religion or national origin, these protected characteristics could play an improper role in HR decisions, he said.
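
Sonderling’s point about skewed training data maps onto the disparate-impact math the EEOC has long applied to hiring. As a rough, hypothetical illustration only (no specific tool or dataset was discussed at the event), the Python sketch below applies the agency’s four-fifths rule to made-up selection decisions from a hiring model; the function names, group labels and numbers are invented for the example.

```python
# Illustrative only: apply the EEOC "four-fifths rule" to hypothetical
# hiring decisions attributed to an automated screening model.
# All numbers and group labels below are invented for the example.

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs -> selection rate per group."""
    totals, hired = {}, {}
    for group, selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        hired[group] = hired.get(group, 0) + int(selected)
    return {g: hired[g] / totals[g] for g in totals}

def adverse_impact_flags(rates, threshold=0.8):
    """Flag any group whose rate falls below 80% of the highest group's rate."""
    top = max(rates.values())
    return {g: (rate / top) < threshold for g, rate in rates.items()}

if __name__ == "__main__":
    # Hypothetical model outputs: (protected-group label, hired?)
    decisions = ([("A", True)] * 60 + [("A", False)] * 40
                 + [("B", True)] * 35 + [("B", False)] * 65)
    rates = selection_rates(decisions)
    print(rates)                         # {'A': 0.6, 'B': 0.35}
    print(adverse_impact_flags(rates))   # {'A': False, 'B': True}
```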

Committee for Justice President Curt Levey warned against asking AI models to cure society’s ills. If AI systems are blocked from drawing certain conclusions based on gender or race, it could make the technology less accurate, he said: For example, a parole system relying on language models might draw on data indicating males are more likely to be repeat offenders, and the model could become more “error-prone” if that correlation is removed. Levey suggested AI bias might be downstream from a larger cultural issue, since the systems are trained on societal data and merely reflect societal norms. Systems that depend on historical data are built on data sets shaped by a history of discrimination, said Gary Marcus, New York University professor emeritus-psychology and neural science: “I think we can and should ask for more.”
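
Levey’s accuracy argument is, at bottom, an empirical claim about what happens when a correlated feature is withheld from a model. The purely synthetic sketch below (it models no real parole or recidivism system, and the correlation is built into the fake data by construction) trains a logistic regression with and without a hypothetical protected attribute and compares test accuracy.

```python
# Purely synthetic sketch: compare test accuracy when a correlated,
# hypothetical protected attribute is kept vs. removed from the features.
# The correlation is built into the fake data; this models no real system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
sensitive = rng.integers(0, 2, n)          # hypothetical protected attribute
other = rng.normal(size=n)                 # some other predictor
logits = 1.2 * sensitive + 1.0 * other - 0.6
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_full = np.column_stack([sensitive, other])
X_blind = other.reshape(-1, 1)             # same data with the attribute removed

for label, X in [("with attribute", X_full), ("without attribute", X_blind)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    acc = LogisticRegression().fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{label}: test accuracy {acc:.3f}")
```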

The Computer & Communications Industry Association argued against creating a single AI regulatory agency because it would lack the “expertise” needed to address the various issues related to AI. “Existing agencies should seek to incorporate AI expertise as they regulate AI within their own areas of authority,” said Senior Counsel-Innovation Policy Josh Landau.