US Needs ‘Creative’ Ways to Restrict AI Transfers to China, Experts Say
U.S. policymakers should explore new ways to restrict transfers of items and services that China may be using to advance its artificial intelligence capabilities, such as data, algorithms and human capital, the Center for a New American Security said in a report this week. Although the administration should “aggressively” restrict exports to China of advanced semiconductor equipment, the report said Washington also needs to “seek out creative tools to regulate other basic building blocks of AI.”
The report argues for “bold action to constrain China’s progress in AI for military and repressive purposes.” Existing semiconductor controls -- such as those placed on “cutting-edge chips” and “know-how” -- should continue, it said, but more is needed.
“The reasons for action are clear: China succeeding in its military AI ambitions and gaining a sizable advantage in the commanding heights of military-technological power threatens to make an already-serious security threat worse,” said the report, authored by CNAS adjunct fellow Alexander Sullivan, project assistant Noah Greene and senior fellow Jacob Stokes, who worked on the national security staff of then-Vice President Joe Biden.
The report suggested U.S. policymakers could eventually place controls on exports or disclosures to China of “general-purpose algorithms,” such as large language models, which would be a departure from existing restrictions. “So far, U.S. efforts to deny China the fruits of military AI have focused on compute -- the advanced semiconductors that process data for AI systems -- while addressing other basic building blocks of AI, including data, human talent, and algorithms, using far more targeted tools,” the authors said. “However, Washington may in the future place limits on these other categories.”
They pointed to the U.S. effort to ban TikTok from operating in the U.S., a campaign partially driven by lawmakers who fear the app is providing the Chinese government with Americans’ personal data (see 2307210046). The U.S. “could also move in the future to prevent data that would be more relevant to military AI from flowing to China,” the report said, adding that the government has “already placed limited restrictions on exporting or disclosing the source code of AI algorithms designed for geospatial analysis.” The Bureau of Industry and Security in 2020 placed controls on certain geospatial imagery software (see 2201050027 and 2001030024).
Washington could introduce similar AI-related guardrails around inbound and potentially outbound investments, the report suggested. “For example, the Committee on Foreign Investment in the United States (CFIUS) would almost certainly reject any attempt by a Chinese entity to invest in OpenAI.”
The authors also cautioned against broad export controls, saying the U.S. should “restrict itself to technologies with clear military (and dual-use) and repression applications and continually refine its policies to ensure their effectiveness while avoiding overly broad restrictions that end up being self-defeating.” Some restrictions, if not properly tailored, could undermine the U.S. AI ecosystem by allowing products from foreign competitors to replace items offered by American technology companies, the report said, adding that most controls will be “ineffective” without allies imposing similar measures.
“U.S. companies could lose out on valuable commercial opportunities,” the report said, “only to be backfilled by foreign competitors.”
Washington should look to “build consultations on these issues” into its existing alliances, such as the North Atlantic Treaty Organization and the Australia-U.K.-U.S. (AUKUS) partnership. “Early discussions of military AI in NATO, the AUKUS partnership, and bilateral alliances with Japan and South Korea should be expanded, including potentially into” the Group of 7 nations, the report said. Those forums “offer constructive places to hash out key tactical questions with like-minded partners, such as whether agreements should regulate specific technologies or instead focus on regulating certain outcomes.”