Analysis reveals high probability of Starmer’s audio on Rochdale to be a deepfake

By: Nikolaj Kristensen
February 16, 2024

Keir Starmer and members of the shadow cabinet watch Deputy Leader Angela Rayner speak at the Labour Conference 2023 in Liverpool, October 8, 2023. (Image: GaryRobertsphotography/Alamy Live News)

A voice recording of what some claim to be U.K. Labour Party leader Keir Starmer is circulating on social media. 

“BREAKING: A behind-the-scenes corridor recording of Starmer about the Rochdale Azhar Ali crisis has been leaked,” reads the post that first shared the recording on February 14, 2024, and has since amassed more than 250,000 views.  

Labour recently withdrew support for Azhar Ali, the party’s candidate for the byelection in the town of Rochdale, after he commented that Israel had allowed Hamas’ October 7 attack to happen. 

The post (archived) includes a 12-second-long audio clip in which Starmer is seemingly heard saying: “I don’t care what you guys say. Look, I think we’ve gotten away with this one. Our supporters are beyond thick. They won’t care if we just hand over and lose Rochdale. As long as we’re scoring points with Israel, they’ll be happy.” 

Many users in the comments speculate that the recording is fake, and it certainly appears to be. Assessing the authenticity of an audio clip is difficult, which makes audio deepfakes especially suitable for online misinformation.

High likelihood of fakery

Dr. Dominic Lees, a professor of filmmaking who leads the University of Reading’s Synthetic Media Research Network, told Logically Facts that one feature of the Starmer audio in particular reflects a key difference between natural human speech and synthetic speech. 

“Real voices exhibit more pauses than synthetic speech, so an unusual fluency in a suspect audio can be a good indication that it is fake. This seems to be the case with the Keir Starmer clip,” he said, though he stressed that his expertise is not in voice cloning. A computational analysis of the recording would be needed to determine with certainty that it is a voice clone. 
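As a rough illustration of the fluency cue Lees describes, the sketch below uses the open-source librosa library to estimate how much of a clip is silence. It is a toy heuristic, not a forensic test; the file name, 16 kHz sample rate, and 30 dB silence threshold are all illustrative assumptions.

```python
# Toy version of the "pause ratio" heuristic: real speech tends to contain
# more silence than synthetic speech. Thresholds here are NOT calibrated.
import librosa

def pause_ratio(path: str, top_db: int = 30) -> float:
    """Return the fraction of the recording that is (near-)silent."""
    y, sr = librosa.load(path, sr=16000)               # mono waveform, 16 kHz
    voiced = librosa.effects.split(y, top_db=top_db)   # non-silent intervals
    voiced_samples = sum(end - start for start, end in voiced)
    return 1.0 - voiced_samples / len(y)

ratio = pause_ratio("suspect_clip.wav")  # hypothetical file name
print(f"Silence fraction: {ratio:.2%}")  # unusually low may hint at synthesis
```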

Logically Facts contacted two experts, who tested the audio clip against four different detection models built to discern whether a clip is authentic. Both found, with high confidence scores, that the audio is fake. 

One of the experts was Nicolas Müller, a machine-learning research scientist at Fraunhofer AISEC, a German research institute focused on security technologies. The institute has a tool built specifically to analyze audio for deepfake traces.

“The tool uses machine learning to analyze the audio file in question,” explained Müller. “It picks up on small traces that we humans cannot identify, such as disfluencies and monotonous voice, but also other features, such as the specific fingerprint of the AI generation tool that made this fake. As is usually the case with data-driven learning, we don't fully know what features exactly the model uses to make its classification.”
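In broad strokes, detectors of this kind turn the audio into spectrogram features and pass them to a trained binary classifier. The sketch below shows only that generic pipeline; it is not Fraunhofer AISEC’s tool, and `load_detector` stands in for whatever trained model one has available.

```python
# Generic sketch of audio deepfake scoring: mel-spectrogram features fed to a
# binary classifier. This is NOT Fraunhofer AISEC's tool; the model is assumed.
import librosa
import torch

def deepfake_score(path: str, model: torch.nn.Module) -> float:
    y, sr = librosa.load(path, sr=16000)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=80)
    log_mel = librosa.power_to_db(mel)                  # log-scale features
    x = torch.from_numpy(log_mel).float()[None, None]   # (batch, chan, mel, time)
    with torch.no_grad():
        logit = model(x)                                # single "fake" logit
    return torch.sigmoid(logit).item() * 100            # 0-100 deepfake score

# model = load_detector()  # hypothetical: any trained spectrogram classifier
# print(f"Deepfake score: {deepfake_score('suspect_clip.wav', model):.1f}")
```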

The tool gave the Starmer recording a so-called deepfake score of 93.6, indicating a 93.6 percent probability that the clip is a deepfake.

These detection models do have limitations, however: they struggle to adapt to the rapidly evolving techniques behind new deepfakes.

Contextual evidence

While the models’ assessments cannot be used as definite proof, other signs point to the same conclusion. 

It’s not the first time the user has shared inauthentic content. Recently, the same user posted a video clip of former U.S. President Donald J. Trump speaking about “King Charles’s butt cancer.” 

The Starmer audio clip is seemingly meant as a joke. Since posting it, the user has reposted several humorous takes on the clip, many apparently taking shots at investigative journalists and fact-checkers.

On X, BBC disinformation analyst Shayan Sardarizadeh said the clip was generated by artificial intelligence. “The content isn't even remotely believable as it appears to have been created for parody,” read his post.

Logically Facts has contacted the user who originally posted the audio clip. This article will be updated if and when we receive a reply.

What to look for 

Oli Buckley, a professor of cyber security at the University of East Anglia, agrees that the Starmer audio is likely inauthentic. He says it is quite well masked: a layer of background noise has been added to make it sound like a genuine phone recording.
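For illustration, layering such masking noise over a clean synthetic voice takes only a few lines, which is part of why the trick is so common. The sketch below uses the soundfile library and assumes a mono input file and an arbitrary noise level.

```python
# Minimal sketch of the masking Buckley describes: faint broadband noise laid
# over a clean synthetic voice so artifacts blend into "phone line" hiss.
# File names and the 0.01 noise amplitude are illustrative assumptions.
import numpy as np
import soundfile as sf

y, sr = sf.read("synthetic_voice.wav")        # assumed mono generated speech
noise = np.random.normal(0.0, 0.01, len(y))   # faint broadband hiss
sf.write("masked_voice.wav", y + noise, sr)
```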

Buckley listed a couple of things to look out for if you come across audio you think might be fake. “It’s things like being objective, would the person say it? If it's out of character then how likely is it they did say it? Listen for the rhythm and cadence of the words, is it how the person normally talks?”

He also mentioned listening to the language used, as tools for generating deepfakes often rely on a text-to-speech process in which you type what you want the voice to say. 

“People don't always talk how they write, meaning you get unusually formal speech, as in ‘I am’ instead of ‘I'm’ and ‘there is’ instead of ‘there's’. And often the tone doesn't match up to the words, for example, in this [the Starmer] audio, he's calling the supporters thick, but the way he says it doesn't marry up,” said Buckley. 
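Buckley’s point about unusually formal speech can be approximated roughly on a transcript. The toy sketch below counts expanded forms against their contractions; a high ratio is only a weak hint, not evidence on its own, and the word list is an illustrative assumption, not anything the experts quoted here use.

```python
# Toy illustration of the "unusually formal speech" cue: how often does a
# transcript use expanded forms where spoken English would normally contract?
import re

FORMAL_PAIRS = {"i am": "i'm", "there is": "there's",
                "do not": "don't", "it is": "it's"}

def formality_ratio(transcript: str) -> float:
    text = transcript.lower()
    formal = sum(len(re.findall(rf"\b{exp}\b", text)) for exp in FORMAL_PAIRS)
    casual = sum(text.count(con) for con in FORMAL_PAIRS.values())
    total = formal + casual
    return formal / total if total else 0.0

print(formality_ratio("I am sure there is nothing to worry about."))  # 1.0
```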

Not the first, not the last

It’s not the first time Starmer’s voice has fallen victim to audio fakes. During the Labour Party conference in October 2023, a recording surfaced on social media in which Starmer was seemingly heard swearing at and verbally abusing Labour staff members. Fact-checkers took pains to assess the authenticity of the clip, concluding that it was most likely generated by artificial intelligence. 

Logically Facts has contacted Keir Starmer. We will update this article if and when we get a comment. 

Starmer is far from the only politician recently targeted by this type of misinformation. In the run-up to New Hampshire’s primary in January, voters in the state received a robocall in which fake audio of U.S. President Joe Biden urged Democrats to stay home. Similarly, the day before Pakistan’s February 8 general election, a fake audio clip of former Pakistani prime minister Imran Khan calling on his Pakistan Tehreek-e-Insaf party to boycott the vote hit social media. 

Logically Facts will continue to cover the threat of audio and other generative AI deepfakes, which experts consider one of the main misinformation challenges in the elections-heavy year ahead.
