The Treasury Department on Tuesday announced new sanctions on entities in Iran and Russia over attempted election interference in the 2024 U.S. election – including by using artificial intelligence tools to create and disseminate disinformation.
In a Dec. 31 press release, the Treasury Department said it was imposing sanctions on a subsidiary of Iran’s Islamic Revolutionary Guard Corps (IRGC), as well as a Moscow-based affiliate organization of the Russian Main Intelligence Directorate (GRU) and its director.
“The Governments of Iran and Russia have targeted our election processes and institutions and sought to divide the American people through targeted disinformation campaigns,” said Acting Under Secretary of the Treasury for Terrorism and Financial Intelligence Bradley T. Smith. “The United States will remain vigilant against adversaries who would undermine our democracy.”
According to the department, “at the direction of, and with financial support from, the GRU,” the Moscow-based Center for Geopolitical Expertise (CGE) and its personnel used generative AI tools to create disinformation “that would be distributed across a massive network of websites designed to imitate legitimate news outlets to create false corroboration between the stories, as well as to obfuscate their Russian origin.”
CGE also manipulated a video to produce “baseless accusations concerning a 2024 vice presidential candidate in an effort to sow discord amongst the U.S. electorate,” the Treasury Department said, without naming the candidate or pointing to a specific video.
The Treasury Department also said the Cognitive Design Production Center (CDPC), an IRGC subsidiary, has since at least 2023 “planned influence operations designed to incite socio-political tensions among the U.S. electorate in the lead up to the 2024 U.S. elections” on the IRGC’s behalf.
This development follows concerns raised in September by top technology executives, who warned Congress of foreign efforts to influence U.S. public opinion ahead of the 2024 election – including “misinformation campaigns” being shaped by advanced AI technology.
U.S. Must Leverage Tech to Fight Foreign Election Interference
Similarly, one cybersecurity expert told lawmakers in late December that the United States needs to leverage emerging technology – such as AI – to better protect against foreign election interference perpetrated through financial crime.
During a Dec. 18 hearing held by the Committee on House Administration, Chairman Bryan Steil, R-Wis., explained that Federal law generally prohibits noncitizens or foreign businesses from directly contributing to political candidates or campaigns.
However, Rep. Steil said many foreign actors are abusing a loophole that allows them to indirectly funnel money to candidates and campaigns through 501(c) organizations.
“Democrats and Republicans agree that elections should be free from foreign interference. This should not be a partisan issue,” the chairman said. “It’s imperative that we continue working to prevent foreign interference, and it starts with closing the loopholes that exist under current Federal law.”
One witness told the committee that closing such a loophole will require lawmakers not only to address the regulatory gaps but also to consider improving Federal technology.
Matthew O’Neill, a former managing director of the U.S. Secret Service’s global cyber investigative operations, explained that millions of suspicious activity reports (SARs) are filed every year. Financial institutions must file a SAR with the Financial Crimes Enforcement Network (FinCEN) when they suspect or detect illegal activity – such as a suspected political donation from a foreign actor.
However, O’Neill said, “SARs often lack key metadata, such as IP addresses, that are critical to tracking cyber-enabled financial crimes.”
To address this challenge, O’Neill said Congress could require SARs “to include metadata, such as IP addresses and geolocation data, for better traceability.” Additionally, he said law enforcement could greatly benefit from using artificial intelligence to help sort through SARs.
“Law enforcement can leverage current SARs and CTRs [currency transaction reports] that are submitted from financial institutions. It would be ideal if more payment processors and other FinTech companies were also required to submit suspicious activity reports, because those do help law enforcement, but there are millions of SARs that are filed every single year,” O’Neill said.
“So, leveraging artificial intelligence to mine for those specific suspicious activity reports or heightened surveillance reports, whatever is provided, would be a value,” he added. “We haven’t been able to keep up with the technology.”