Video of Sudha Murty Promoting a Financial Program Is Fake

October 8, 2024
October 5, 2024
Manipulated Media/Altered Media
Screengrabs of the video analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video that features Sudha Murty, founder of the Infosys Foundation — the philanthropic arm of the tech giant Infosys — apparently unveiling a financial project jointly with her husband N.R. Narayana Murthy, co-founder of Infosys. After putting the video through A.I.-detection tools and getting our expert partners to weigh in, we were able to conclude that the original video featuring Mrs. Murty was manipulated with an A.I.-generated audio track.

The two-minute-and-54-second video in English, embedded in a Facebook post, was sent to the DAU by a fact-checking partner for verification. The video opens with visuals of Murty, also a member of the Rajya Sabha, speaking into a hand-held microphone while interacting with the audience at what appears to be a public event. Mr. Murthy — the couple use different spellings for their last name — can be seen seated beside her.

Separate boards bearing the words “Startup Conclave” and “Program Infosys” are visible in the backdrop with “Infosys” typeset in a manner similar to the logo of the tech giant. Murthy does not speak in the video while a female voice accompanying the visuals of his wife, Sudha, can be heard making a case for a supposed financial project. The project is touted as a highly profitable and unique opportunity for Indians to make money regularly with no risk involved.

The female voice claims that the project is the brainchild of Murthy and mentions that it is driven by a computing mechanism that ensures maximum profit. The same voice encourages the audience to sign up for the project urgently, with the reassurance that Murty and her husband use the same financial model as well.

A logo resembling that of Moneycontrol, a business news and analysis website, is visible in the top right corner throughout the video. Visuals of crowds filming Murty and applauding her flash in some frames, and as the video ends a black screen fades in with the text: “take your chance today”.

No oddities are noticeable in the frames that feature Murthy. However, many inconsistencies are noticeable on his wife’s face despite the poor quality of the video. In the close-ups her lip-sync is clearly imperfect. Though her mouth movements are barely visible when the camera is zoomed out, the dissonance between the audio track and the way her lips move can still be noticed.

In a few instances a handheld microphone can be seen disappearing into her lower jawline, and the shape of her chin appears to distort. In one frame, the lower part of her mouth seems to move unnaturally as she speaks. Her teeth briefly seem to disappear in a close-up, and a quiver is visible around her mouth when the camera zooms in on her face.

Subject’s mouth moves unnaturally as the microphone disappears into her lower jawline

The voice in the video sounds somewhat like Murty’s when compared with her recorded interviews and speeches. Her lisp is faintly captured in that voice, as are the pauses characteristic of her style of speaking. However, those pauses are followed by a slight noise, possibly an attempt to mimic human breathing. Her usual accent is missing from the audio. The overall delivery sounds scripted, lacking natural intonation and pitch, with some abrupt edits in the speech that make it sound as if the audio clip has been stitched together.

A reverse image search using screenshots from the video led us to this video, published on the official YouTube channel of Moneycontrol on July 7, 2023. A series of separate clips from the original video seem to have been lifted, looped, and patched together to create the manipulated video.

The clothing and the body language of the couple in both videos are identical. However, the board bearing the text “Project Infosys” is not visible in the original video; instead a different board with different text is visible, which suggests tampering with the backdrop visuals.

Murty speaks in English in both videos but the audio is different. She does not refer to any financial project in the original video, nor does a black screen with accompanying text — “take your chance today” — appear in that video. The original video also has brand endorsements visible at the bottom of the frame, which are missing from the doctored video.

To discern the extent of A.I.-manipulation in the video under review, we put it through A.I.-detection tools.

The voice detection tool of Hiya, a company that specialises in artificial intelligence solutions for voice safety, returned results indicating a high probability that an A.I.-generated audio track was used in the video.

Screenshot of the analysis from Hiya’s audio detection tool

Hive AI’s deepfake video detection tool indicated that the video was manipulated using A.I. It pointed out markers, throughout the video, on the faces of Murty as well as her husband, though he cannot be seen speaking anywhere in the video. However, Hive’s audio tool did not detect the use of A.I. in the audio track.

Screenshot of the analysis from Hive AI’s deepfake video detection tool

The deepfake detector of our partner TrueMedia suggested substantial evidence of manipulation in the video. The “AI-generated insights” offered by the tool provided additional contextual analysis, stating that the audio transcript reads like a promotional script for a financial scam or get-rich-quick scheme.

The tool gave a 76 percent confidence score to “face manipulation detector”, a subcategory which detects potential A.I. manipulation of faces, as in the case of face swaps and face reenactment. The tool also gave a 37 percent confidence score to “video facial analysis”, a subcategory which analyses the video frames for unusual patterns and discrepancies in facial features.

The tool gave a 97 percent confidence score to “audio authenticity detector”, a subcategory which indicates that the audio was created by an A.I. generator or by cloning.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the A.I.-generated insights and video analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and semantic analysis from TrueMedia’s deepfake detection tool

For further analysis of the audio, we also put it through the A.I. speech classifier of ElevenLabs, a company specialising in voice A.I. research and deployment. It returned results indicating that it was highly likely that the audio track featured in the video was generated using their software.

We reached out to ElevenLabs for a comment on the analysis. They told us that they were able to confirm that the audio is synthetic, implying that it was generated using A.I. They added that the user who broke their “terms of use” by generating the audio was subsequently blocked from their platform.

To get further expert analysis on the video, we escalated it to the Global Deepfake Detection System (GODDS), a detection system set up by Northwestern University’s Security & AI Lab (NSAIL). The team used a combination of 22 deepfake detection algorithms and analyses from two human analysts trained to detect deepfakes to review the video escalated by the DAU.

Of the 22 predictive models used to analyse the video, three gave a higher probability of the video being fake, while the remaining 19 indicated a lower probability of the video being fake.

In their report, the team noted that the speech and mouth movements of the subject are frequently misaligned. They pointed to several moments in the video where they noticed speech with no corresponding mouth movement as well as instances where the audio and mouth movements did not seem to match. The team stated that the resulting visual discrepancy could suggest that the media is inauthentic.

The team also observed several inconsistencies in the subject’s mouth and face including the microphone blending into the subject’s face, which corroborated our own observations. They concluded that the video is likely to be fake and created with artificial intelligence.

On the basis of our findings and analyses from experts, we assessed that the original footage featuring Murty and her husband was overlaid with an A.I.-generated audio track to fabricate the video. It is an attempt to associate the philanthropist with a dubious financial initiative to scam people.

(Written by Debopriya Bhattacharya and Debraj Sarkar, edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.

You can read below the fact-checks related to this piece published by our partners:

This Video of Sudha Murthy Promoting an Investment App Is AI Generated

Fact Check: Sudha Murthy Did Not Promote Investment App; Don’t Believe This Deepfake Video

Video Of Sudha Murty Promoting Investing App Is An AI Voice Clone

Viral Video Of Sudha Murty Promoting Investing App Is Not Real, But Deepfake

Sudha Murty's Viral Investment Video: A Deepfake Deception