Imran Khan's PTI to boycott polls? Deepfake audio attempts to mislead voters in Pakistan

By: Anurag Baruah
February 8 2024

Imran Khan has not asked his party PTI to boycott the current general elections in Pakistan. (Source: Wikimedia Commons/X/Modified by Logically Facts)

On February 7, 2024, just a day before Pakistan's highly anticipated general elections, a voice recording alleged to be of former Prime Minister Imran Khan circulated on social media. The popular leader is imprisoned and was barred from the polls because of a graft conviction.

This audio clip, purported to feature Khan calling for an election boycott by the Pakistan Tehreek-e-Insaf (PTI), was disseminated by several social media accounts late in the evening on X (formerly known as Twitter).

Soon after the audio emerged, PTI clarified from its official X account that the recording was fake, alleging that "the controlled media is being used to run fake news about PTI boycotting elections, along with running a fake audio!"

In a conversation with Logically Facts, PTI leader Zulfi Bukhari asserted that the audio was entirely fabricated and dismissed any notion of a boycott. Bukhari emphasized, “Imran Khan and his party only have had one demand for the past two years. That is for free and fair elections. Although these elections are nowhere close to free and fair, they would have boycotted had it been any other country or party. We have maintained to contest as we know the people of Pakistan are overwhelmingly with us.”

‘AI voice, unnatural noise’: Experts analyze the audio

Two independent experts consulted by Logically Facts confirmed with a high degree of certainty that the audio was artificially generated and did not originate from Khan.

Tanmay Srivastava, an expert in audio forensics, identified several anomalies within the viral audio clip. He noted the presence of unnatural white noise throughout the recording, likely added in post-production to simulate authenticity.

“It seems like the noise was introduced after the AI voice was created because it is not properly mixed with the actual audio recording,” Srivastava added.

Srivastava also observed that the intonation of Khan’s voice throughout the clip was monotone, with no natural variation in pitch or tone that would typically be present in a genuine recording.

“All the sibilant consonants that produce high-frequency sounds, like the s's and t's, or ‘esses,’ are not sounding natural, the way they should sound in a normal recording… they sound mechanical or highly processed,” he added.
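To make these acoustic cues concrete, the minimal sketch below shows how such checks can be run programmatically: it estimates the pitch (F0) contour, whose variation tends to be low in monotone synthetic speech, and the share of spectral energy in the roughly 4-8 kHz band where sibilants sit. This is an illustration, not the experts' actual pipeline; the librosa library and the file name viral_clip.wav are assumptions.

```python
import numpy as np
import librosa

# Load the clip (file name is a placeholder); 16 kHz is sufficient for speech.
y, sr = librosa.load("viral_clip.wav", sr=16000)

# 1. Pitch (F0) contour: natural speech varies in pitch; a near-flat
#    contour is consistent with the monotone delivery described above.
f0, voiced_flag, _ = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
print(f"F0 standard deviation: {np.nanstd(f0[voiced_flag]):.1f} Hz")

# 2. Sibilant energy: s- and t-like sounds concentrate their energy
#    roughly in the 4-8 kHz band; heavy processing often distorts it.
S = np.abs(librosa.stft(y))
freqs = librosa.fft_frequencies(sr=sr)
band = (freqs >= 4000) & (freqs <= 8000)
print(f"4-8 kHz energy share: {S[band].sum() / S.sum():.2%}")
```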

Srivastava highlighted specific timestamps where the audio appeared heavily edited or manipulated to correct pronunciation or to add authenticity, indicating a sophisticated level of digital alteration.

“At the 0:38-0:39 mark, the voice cracks due to multiple layering to fix the pronunciation of the word (sibilant consonants). Unnatural noise was added at 0:02-0:03, 0:28-0:29, and 1:15-1:17 to make the audio look more genuine. Also, at 1:03-1:05, unnatural-sounding sibilant consonants can be heard,” Srivastava said.
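The flagged spans can be probed the same way. In this hedged sketch (same assumed file), spectral flatness, which approaches 1.0 for white noise and is far lower for speech, is computed over the timestamps Srivastava cites; an abrupt jump relative to the surrounding speech would be consistent with noise layered in after synthesis.

```python
import librosa

y, sr = librosa.load("viral_clip.wav", sr=16000)  # placeholder file name

# Spans flagged for added noise: 0:02-0:03, 0:28-0:29, and 1:15-1:17.
flagged = [(2, 3), (28, 29), (75, 77)]  # (start, end) in seconds

for start, end in flagged:
    segment = y[int(start * sr):int(end * sr)]
    # Spectral flatness is ~1.0 for white noise, much lower for voiced speech.
    flatness = librosa.feature.spectral_flatness(y=segment).mean()
    print(f"{start}s-{end}s mean spectral flatness: {flatness:.3f}")
```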

Further analysis came from experts at IIT Jodhpur, led by Professor Mayank Vatsa, who used deep-learning models to evaluate the audio's authenticity. Their findings, shared with Logically Facts, drew on four distinct models and indicated with high confidence that the audio was a fabrication created using AI technology: two models assigned confidence scores above 0.9, and one reached the maximum score of 1.
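IIT Jodhpur has not published the models themselves, but the shape of such an ensemble verdict is straightforward to illustrate. The toy aggregator below is not their system: each hypothetical detector returns a probability that the clip is synthetic, and the scores are averaged and vote-counted. Three of the example scores mirror the reported pattern (two above 0.9, one at the maximum of 1); the fourth is assumed.

```python
from statistics import mean

def ensemble_verdict(scores, threshold=0.5):
    """Combine per-model 'synthetic' probabilities into one verdict."""
    avg = mean(scores)
    votes = sum(s > threshold for s in scores)
    label = "likely AI-generated" if avg > threshold else "likely genuine"
    return f"{label} (mean score {avg:.2f}, {votes}/{len(scores)} models agree)"

# Illustrative scores only; the fourth model's score was not disclosed.
print(ensemble_verdict([0.93, 0.96, 1.0, 0.72]))
```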

Surge of deepfakes in Pakistan elections

The proliferation of deepfake technology has become a significant concern globally, with numerous instances of its application in disrupting election campaigns.

Pakistan's election has been no exception, witnessing several deepfake videos and audio clips designed to mislead the public, including false claims of an election boycott by PTI. Beyond Khan's clip, another potentially deepfake audio, purportedly from a leaked group conversation, laid out the supposed strategy behind a PTI boycott of the general elections.

PTI has actively countered these narratives, with officials publicly disputing the authenticity of videos and audio clips attributed to them. Recently, PTI leader Muhammad Basharat Raja took to X to state that a video of him announcing an election boycott was an AI-generated deepfake and that he was, in fact, contesting the current election.

Meanwhile, the Information Secretary of PTI London flagged another digitally altered video of a PTI leader purportedly announcing a similar boycott.

Interestingly, PTI itself acknowledged using AI technology in December 2023 to create an audio message in Khan's voice from text he had written in prison, underscoring the complex role of digital innovation in modern political discourse. The party said, however, that the audio was generated with the approval of his lawyers.

A repeat of the Bangladesh elections?

The issue of deepfakes is not confined to Pakistan. The recent Bangladesh elections also encountered similar challenges, with deepfake videos of independent candidates falsely declaring their withdrawal from the race, underscoring the global penetration of AI technology in electoral processes.

As the world braces for elections in over 50 countries in 2024, the escalation of deepfake technology underscores an urgent need for vigilance and verification to preserve the integrity of democratic institutions.
