Video of Muhammad Yunus Announcing Resignation as Bangladesh Chief Is Fake

September 19, 2024
Manipulated Media/Altered Media
Screengrabs of the video analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video featuring the Nobel laureate Muhammad Yunus, chief adviser of Bangladesh’s interim government, apparently announcing his resignation. After we put the video through A.I. detection tools and sought analysis from our expert partners, we were able to conclude that original footage of Mr. Yunus had been combined with synthetic audio to fabricate the clip.

The 16-second video in English, embedded in a Facebook post, was sent to the DAU by a fact-checking partner for analysis. The video opens with visuals of Yunus speaking to the camera in an office-like setting with two flags visible in the backdrop: the national flag of Bangladesh and another flag bearing insignia and text inscribed in Bengali, which translates to “chief adviser”. Miniature versions of the flags are also visible in the foreground.

A male voice recorded over his visuals makes it sound as if Yunus is resigning, handing over power to another individual, and announcing the dismissal of the army chief. A logo featuring the colours of Bangladesh’s national flag and text in Bengali is visible in the top left corner of the frame throughout the video. The logo resembles that of the state-owned broadcaster, Bangladesh Television or BTV.

The Facebook post carrying the video was published on Sept. 13 by someone whose profile information suggests that they live in Dhaka. The text in that post, written in Bengali, asks readers to listen carefully to what is described as the chief adviser’s resignation.

The overall video quality is poor. Despite that, it is apparent that Yunus’s lip movements in the video track do not match the words heard in the audio track. However, the visible pauses in his lip movements mostly align with the pauses in the audio track. The editing and transitions in the video are patchy, resulting in jump cuts.

The voice in the video sounds somewhat similar to Yunus’s, based on a comparison with his voice as heard in his publicly available speeches and interviews. However, the accent and the overall style of delivery are different. There are no variations in the tone and pitch of the speech heard in the suspicious video, making the audio sound scripted. Background noise, audible throughout the video, gets louder each time there is a pause in the recorded speech.

We ran a reverse image search using screenshots from the video and traced it to this video, published on the official YouTube channel of BTV on Sept. 11. Yunus’s clothing, backdrop and body language in the original video and the doctored video are identical. However, his speech in the original video is in Bengali, and in it he lays out a roadmap for Bangladesh’s future.
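As a side note on method, pulling screenshots from a clip for a reverse image search can be done programmatically. The snippet below is a minimal sketch using OpenCV; the file name suspect_video.mp4 and the one-frame-per-second sampling interval are assumptions for illustration, not the DAU’s actual workflow.

```python
# Minimal sketch: sample roughly one frame per second from a video so the
# frames can be used in a reverse image search. The file name and the
# sampling interval are illustrative assumptions.
import cv2

cap = cv2.VideoCapture("suspect_video.mp4")
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if FPS metadata is missing
saved = 0
frame_idx = 0

while True:
    ok, frame = cap.read()
    if not ok:  # end of stream or read error
        break
    if frame_idx % int(fps) == 0:  # roughly one screenshot per second
        cv2.imwrite(f"frame_{saved:03d}.png", frame)
        saved += 1
    frame_idx += 1

cap.release()
print(f"Saved {saved} screenshots for reverse image search")
```

Each saved frame can then be uploaded to a reverse image search engine to look for earlier appearances of the same footage.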

The video frames in the doctored video also appear cropped, as some foreground details visible in the original video are missing, such as the table in front of Yunus on which he rests his hands.

To discern if A.I. was used to manipulate the video, we ran it through A.I. detection tools.

The voice detection tool of Loccus.ai, a company that specialises in artificial intelligence solutions for voice safety, returned results indicating a high probability that the audio track in the video was A.I.-generated.

Screenshot of the analysis from Loccus.ai’s audio detection tool

Hive AI’s deepfake video detection tool did not detect any A.I. manipulation in the visuals; however, their audio tool indicated A.I. tampering in portions toward the end of the audio track.

Screenshot of the analysis from Hive AI’s deepfake video detection tool

The deepfake detector of our partner TrueMedia suggested substantial evidence of manipulation in the video. In a breakdown of the overall analysis, their tool gave a 100 percent confidence score to the subcategory of “voice anti-spoofing analysis”, indicating that it is highly likely that the audio was created with an A.I. audio generator. The subcategory of “audio authenticity detector” gave a 99 percent probability that the audio track was either a voice clone or generated by A.I. The subcategory of “AI-generated audio detector” returned a confidence score of 60 percent.

The tool gave a 59 percent confidence score to “video facial analysis”, a subcategory that analyses video frames for unusual patterns and discrepancies in facial features. It also gave a confidence score of 53 percent to “face manipulation detector”, which detects the likelihood of A.I. manipulation of faces, as in face swaps and face reenactment; in this case, Yunus’s face.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and video analysis from TrueMedia’s deepfake detection tool

For further analysis of the audio track, we also put it through the A.I. speech classifier of ElevenLabs, a company specialising in voice A.I. research and deployment. The results were returned as “very unlikely”, indicating that it was highly unlikely that the audio track featured in the video was generated using their software.

We reached out to ElevenLabs for comment on the analysis. They told us that they were unable to confirm that the audio was A.I.-generated. They added that they have been actively identifying and blocking attempts to generate prohibited content, but could not confirm whether this particular audio was generated on their platform.

To get another expert analysis on the video, we escalated it to the Global Deepfake Detection System (GODDS), a detection system set up by Northwestern University’s Security & AI Lab (NSAIL). They used a combination of 22 deepfake detection algorithms and analyses from two human analysts trained to detect deepfakes to review the video escalated by the DAU.

Of the 22 predictive models used to analyse the video, 14 gave a higher probability of the video being fake, while the remaining eight indicated a lower probability.
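GODDS has not published how it weighs its models against one another, but the arithmetic of such an ensemble can be illustrated with a simple majority vote: each model’s fake probability is compared against a threshold and the votes are tallied. The Python sketch below is a hypothetical illustration; the scores and the 0.5 threshold are assumptions, not GODDS’s actual pipeline.

```python
# Hypothetical majority-vote aggregation across deepfake detectors.
# The scores and the 0.5 threshold are illustrative assumptions; they do
# not reflect GODDS's actual models or decision rule.

def aggregate_verdicts(fake_probs, threshold=0.5):
    """Tally per-model fake probabilities into a simple majority verdict."""
    votes_fake = sum(1 for p in fake_probs if p > threshold)
    votes_real = len(fake_probs) - votes_fake
    return {
        "models": len(fake_probs),
        "votes_fake": votes_fake,
        "votes_real": votes_real,
        "majority_says_fake": votes_fake > votes_real,
    }

# 22 made-up scores mirroring the reported split: 14 lean fake, 8 lean real.
scores = [0.91, 0.88, 0.84, 0.79, 0.77, 0.74, 0.71, 0.68, 0.66, 0.63,
          0.61, 0.58, 0.56, 0.53,
          0.47, 0.42, 0.38, 0.33, 0.29, 0.24, 0.18, 0.11]
print(aggregate_verdicts(scores))  # majority_says_fake: True
```

On its own, a 14-to-8 split is suggestive rather than conclusive, which is one reason GODDS pairs its model outputs with review by trained human analysts.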

In their report, the team noted that the audio appears misaligned with the visuals of the subject speaking, which could suggest that the media is inauthentic. They spotted moments when the audio continues even though the subject no longer appears to be talking, something the DAU was unable to spot given the poor video quality.

Their analysis also pointed to jump cuts in the video. They further observed that the perimeter of the subject’s head is blurry, particularly around the hairline and the top of the chair on which he is seated, which contributes to an unnatural appearance.

For another expert view, we turned to our partners at RIT’s DeFake Project. Saniat Sohrawardi and Kelly Wu from the project stated that the video appears to be a cheapfake with generated audio.

The team mentioned that the audio sounds fairly robotic and does not sound much like Yunus. They noted that, given the subject’s age, he pauses more often and shows more variation in his normal speech.

The team also noticed a sharp cut in the video at a point where the subject appears to be speaking; they said it points toward tampering, as it is neither a zoom nor a pan. They further pointed to the subject’s mouth being noticeably out of sync with the audio in most places in the video.

Based on our observations and expert analyses, we can conclude that Yunus did not utter the words being attributed to him, and that original footage was used with A.I.-generated audio to create a fake narrative about his resignation.

(Written by Debopriya Bhattacharya and Debraj Sarkar, edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.

You can read below the fact-checks related to this piece published by our partners:

Fact Check: Did Muhammad Yunus, Head of Bangladesh’s Interim Government, Resign? Know the Truth About the Viral Video (Bengali)

Fake News of the Chief Adviser’s Resignation Spread Using a Deepfake Video (Bengali)