Audio deepfakes threaten elections

Britain's Labour Party Leader Keir Starmer attends television interviews on the final day of the party's annual conference in Liverpool, Britain, October 11, 2023. REUTERS/Phil Noble

Audio deepfakes are emerging as a significant threat to the democratic process, particularly as more than 50 countries hold elections in 2024. Manipulating audio content is becoming cheaper and more accessible, while fact-checkers struggle to determine a recording's authenticity swiftly and definitively. Such recordings can circulate on social media for hours or days before being debunked, and researchers worry that this kind of deepfake content could erode trust in information altogether, creating a political environment in which voters no longer know what to believe.

Last week, members of the Labour Party, the United Kingdom’s largest opposition party, gathered in Liverpool for their annual conference. Hours before it began, a potentially explosive 25-second audio file started circulating on X. The recording purportedly featured Sir Keir Starmer, the party’s leader, using profanity towards a staff member. The account that shared it posted: “I have acquired an audio clip of Keir Starmer verbally mistreating his staff at the Labour Party conference. This reprehensible bully is on the verge of becoming our next Prime Minister.”

The authenticity of the recording remains uncertain: it could be genuine, AI-generated, or the work of an impersonator. The British fact-checking organisation Full Fact is investigating, but says certain characteristics suggest a fabrication, including phrases that appear to be repeated with identical intonation and anomalies in the background noise.

Starmer’s team said the audio is fake and should be ignored, and expressed concern about the impact of AI and social media on democracy.

This incident follows a scandal in Slovakia’s election campaign, where an audio recording released on Facebook appeared to show the leader of the opposition Progressive Slovakia party discussing plans to rig the election. The leader denounced the audio as fake, and AFP’s fact-checking department detected signs of manipulation.

Countries worldwide are grappling with how to respond to audio recordings deemed fake. Alleged deepfake recordings have caused confusion in places like Sudan and India. In Sudan, “leaked recordings” of former leader Omar al-Bashir were suspected of manipulation, while in India, an audio recording of an opposition politician accusing party members of corruption was claimed to be machine-generated.

The ease of creating deepfake media is aggravated by the lack of widely available detection tools. There is no shared standard for adding watermarks or other identifying signals to AI-generated audio. The inability to definitively prove a recording's authenticity introduces uncertainty that politicians may exploit: they can claim that genuine audio is fake and pressure fact-checkers to debunk it, even when fact-checkers lack the tools or the speed to do so. Conversely, fabricated audio can be weaponised against politicians who never said any such thing.
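The watermarking idea mentioned above can be illustrated with a toy sketch. This is not any real or proposed standard: it simply embeds a faint sinusoid at a fixed frequency into a waveform and later tests for energy in that FFT bin. All names, frequencies, and thresholds here are hypothetical choices for illustration; real proposals are far more robust to compression, resampling, and deliberate removal.

```python
import numpy as np

SAMPLE_RATE = 16_000    # samples per second (assumed)
MARK_FREQ = 7_000       # Hz: carrier frequency used as the watermark (hypothetical)
MARK_AMPLITUDE = 0.002  # quiet relative to the audio so it is hard to hear

def embed_watermark(audio: np.ndarray) -> np.ndarray:
    """Tag audio as synthetic by adding a faint sinusoid at MARK_FREQ."""
    t = np.arange(len(audio)) / SAMPLE_RATE
    return audio + MARK_AMPLITUDE * np.sin(2 * np.pi * MARK_FREQ * t)

def detect_watermark(audio: np.ndarray, threshold: float = 5e-4) -> bool:
    """Check whether the FFT bin at MARK_FREQ carries energy above a threshold."""
    spectrum = np.abs(np.fft.rfft(audio)) / len(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1 / SAMPLE_RATE)
    bin_idx = int(np.argmin(np.abs(freqs - MARK_FREQ)))
    return bool(spectrum[bin_idx] > threshold)

# One second of stand-in "speech" (low-level white noise).
rng = np.random.default_rng(0)
clean = 0.01 * rng.standard_normal(SAMPLE_RATE)
marked = embed_watermark(clean)
print(detect_watermark(clean), detect_watermark(marked))
```

The fragility of this scheme is exactly the article's point: a watermark only helps if generators agree to embed it and platforms agree on how to check for it, and a naive tone like this one can be stripped with a simple filter.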
