As America girds for a fractious presidential election next week, security experts warn that what happens after Nov. 5 may pose as much of a digital threat to democracy as the deepfakes and misinformation coursing through the internet and across social media platforms now.
The days leading up to the election have been fraught with concerns over deepfakes and misinformation, such as the fake videos of Vice President Kamala Harris and bogus websites that recently surfaced. Several were allegedly subsidized by the GRU, Russia’s military intelligence service, which funded John Mark Dougan, an operator of several fake news websites, to circulate the fake content.
But even greater havoc could ensue once the votes are in and counted, as the process winds its way to the presidential inauguration in late January 2025. Nation-state adversaries Russia, China and Iran are intent on exploiting emotions that are likely to run high in the days and weeks after the vote, especially amid charges of cheating and attempts to overturn election results, says information warfare expert Bilyana Lilly.
“What the Russians plan to do is pit Americans against each other as they did in 2016, creating protests and anti-protests over hot-button issues,” Lilly, author of “Digital Mindhunters,” said in an interview. “They are trying to stoke rebellion and conflict, as the U.S. government agencies did decades earlier to turn Latin America and European countries against communism.”
“They look at our democratic elections as a cyclical vulnerability every four years,” Lilly added. “It is an opportunity to divide us with powder keg issues like abortion and police violence.”
Indeed, experts say some of these tactics work by reinforcing existing biases.
“Deepfakes are typically used by those who were already predisposed to believe their messaging,” Todd Ruoff, CEO of Autonomys, said in a message. “The solution for that may simply be continuing to put forward facts and including disclaimers, or even using cryptographic signatures for authentication.”
The unrelenting drumbeat of election-related propaganda, like the barrage of political ads bombarding TV airwaves, has been enabled by a paucity of laws and enforcement.
Swing states Pennsylvania and Nevada are among 31 states with no election laws on the books regulating how AI can be used in political content or what transparency is required to warn voters that content is AI-generated.
“The good news is that a lot of voters are aware of AI misinformation going around this election. The bad news is, it’s everywhere, and Election Day won’t be the end of it,” Brad Carson, co-founder and president of Americans for Responsible Innovation, said in an email. “We can expect more AI misinformation to flood online channels in the final stretch of this year’s election and in the weeks that follow as malicious actors try to sow the seeds of chaos.”
Questions over election integrity have ratcheted up since former President Donald Trump falsely claimed he won the 2020 election. His repeated lies, coupled with the rise of Elon Musk’s X, Facebook, TikTok and other social media vehicles that magnify information, true and false, have undercut voter confidence in free and fair elections.
“Social media has become a battlefield for Russian disinformation, particularly during election cycles when influence campaigns can reshape public opinion and even alter the course of democratic processes,” says Areig Elhag, a Washington-based journalist and television anchor. “We must ensure that platforms take their responsibilities seriously to safeguard these spaces.”
Still, the damage has already been done for many. According to Gallup polls, 57% of Americans say they are very or somewhat confident that the votes for president this year will be accurately cast and counted — down from 72% in 2004, 66% in 2016 and 59% in 2020.
Meanwhile, 91% of Americans are worried AI-driven misinformation could interfere with public perception of candidates and election outcomes, a 23% jump since earlier this year, according to a McAfee survey of 2,000 people in October. More telling: 63% of Americans encountered a deepfake in the last 60 days, and nearly half (48%) said those deepfakes swayed their voting decisions.
At the same time, SonicWall detected a 27% rise in malware-related attacks in the month leading up to the national election.
“People talk about hacking into campaigns and compromising voting machines,” Gary Barlet, public sector chief technology officer at Illumio and former chief information officer at USPS, said in an interview. “But my biggest concern is post-election. Small towns and cities don’t have the IT to cope with changes to vote tabulations. Even if vote manipulation happens at one place, it could lead to widespread unrest. One match can lead to explosions, undermining results.”
“If we undermine confidence, we sow seeds of discord,” Barlet said. “It doesn’t have to be based on the truth.”