‘What a bunch of malarkey’: Deepfake robocalls mimic Biden's voice, sparking election manipulation fears

Ahead of the New Hampshire Republican primary, voters received phone calls impersonating President Biden, prompting fears that such deepfake robocalls will be increasingly used to manipulate voters in the 2024 US presidential election.

Omer Kabir | 13:15, 24.01.24

It’s Election Day. You've just started your day, and before you’ve even had breakfast your phone rings. On the line is the voice of the prime minister himself, or of one of his main opponents, instructing you not to go out and vote today but to do so on another day, for some plausible-sounding reason. And so you are convinced to stay home and forfeit your vote. But you didn’t actually hear the politician's voice - you heard a deepfake, an AI-based imitation that sounds exactly like them, deployed by their political rivals to suppress turnout and manipulate election results.

President Joe Biden.

That may sound like science fiction, but it's almost precisely what happened this week to voters in New Hampshire ahead of the Republican primary to choose the party's candidate for the 2024 U.S. presidential election. Experts believe this is just the tip of the iceberg - the beginning of what could be a widespread campaign of AI-based manipulation.

According to NBC News, voters in New Hampshire received phone calls on Sunday in what sounded like the voice of President Joe Biden. The call began with, "What a bunch of malarkey," a phrase associated with Biden, and continued: "It's important to save your vote for the November elections. Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again. Your vote makes a difference in November, not this Tuesday." No one knows who is behind the calls, but they appeared to recipients as coming from the personal cellphone of Kathy Sullivan, a prominent member of the Democratic Party.

However, it wasn't actually Biden on the line - it was an automated robocall using deepfake AI to precisely mimic the president's voice and delivery in an attempt to influence the election. Public awareness of deepfakes has grown over the last five years, fueled by famous incidents such as a 2019 deepfake video of Mark Zuckerberg boasting that he held the secrets of billions of people, and a deepfake speech attributed to former U.S. President Richard Nixon announcing the failure of the Apollo 11 moon landing.


Generative AI (GenAI) models have already been used to create ads in the current election campaign. But now the technology is being used to directly disrupt the voting process itself. Experts are rightly concerned that this is only the beginning of wider efforts, using deepfakes and social media, to distort the results of what is expected to be an intense and emotionally charged election.

Whoever is behind the Biden deepfake calls remains unknown, and the matter is under investigation by the New Hampshire Attorney General's office. "Although the voice in the robocall sounds like the voice of President Biden, this message appears to be artificially generated based on initial indications. These messages appear to be an unlawful attempt to disrupt the New Hampshire Presidential Primary Election and to suppress New Hampshire voters. New Hampshire voters should disregard the content of this message entirely," said the Attorney General's office.

Sullivan believes that those behind the calls aim to harm Biden's election campaign. "I want them to be prosecuted to the fullest extent possible, because this is an attack on democracy," she told NBC News. "I’m not going to let it go. I want to know who’s paying for it. Who knew about it? Who benefits?"

Nomorobo, a company that develops a service to block spam and robocalls, estimated that relatively few voters received the call - fewer than 5,000. However, founder and CEO Aaron Foss says this is just the beginning, and that such calls will play a central role in the 2024 elections. "We've known this day was going to come and now it's here," he told Business Insider. "This is always what happens. Scammers jump on anything new that they can use to get a leg up to go in and rip off."

Jonathan Nelson, the director of product management at Hiya, a company that provides spam and fraud call protection, said, "I would say no question in my mind, this will be the roughest presidential year that we've ever seen as far as robocalls and activity like this.” He added that 2024 will be particularly bad, not only because it's an election year but also because today, scammers have access to more advanced technologies. AI has become cheaper for consumers to access, leading to consistent improvement in deepfake technology. Even if you trust the phone number, “you can’t even trust the voice,” he said.

Professor Hany Farid of the University of California, Berkeley, who studies digital propaganda and misinformation, told The Washington Post that such uses of the technology have been on the horizon for some time. "We are concerned that GenAI will be widely used in the upcoming elections," he told AP. "What we are seeing now is a sign of things to come."

GenAI models significantly upgrade scammers' ability to produce deepfakes. Their broad accessibility and flexible capabilities make production exceptionally cheap and demand far less expertise. The models can generate texts and audio recordings at any scale while accurately mimicking the speaking style of the person being forged - try asking ChatGPT to write a short speech in the style of Donald Trump or Barack Obama, and you'll see. The results can be disseminated through automated calls, as was done in New Hampshire, or on social media in the hope of going viral. The potential uses are diverse, ranging from calls advising people not to vote on Election Day, backed by various explanations, to videos in which candidates say or do things they never did, made to embarrass them or disillusion voters.

There are attempts underway to respond to this threat. Legal authorities will investigate and try to identify and halt these operations. Social media platforms, if they are not as lax in safeguarding as they were in 2016, will detect and remove deepfake content. Fact-checking and news organizations will scrutinize and debunk it. Most people exposed to such content, one must hope, will recognize it as a scam or simply not believe it. However, a presidential campaign can be decided at the state or even county level, so one doesn't need to persuade a majority of voters - it is enough to reach a small number. A phone call from "Biden" to voters in Atlanta, Georgia on Election Day, urging them not to go out to vote because their ballots had supposedly been rejected earlier in the week, could convince enough of them to stay home, potentially handing the county, the state, and the presidency to the Republican candidate, likely Trump.

David Becker, the Executive Director and Founder of the Center for Election Innovation and Research and a former senior official in the U.S. Department of Justice, estimated that deepfake attacks in the upcoming elections could have a much deeper impact on the electorate. It's hard to determine, he told AP, whether the purpose of the calls in New Hampshire was to suppress voting or simply to “continue the process of getting Americans to untether themselves from fact and truth regarding our democracy. They don’t need to convince us that what they’re saying, the lies they’re telling, are true. They just need to convince us that there is no truth, that you can’t believe anything you’re told."
