Video Promoting Tech Project Tied to Nilekani, Ambani, and Murthy Is Fake

August 12, 2024
Manipulated Media/Altered Media
Screengrabs of the video analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video featuring a trio of Indian billionaires purportedly promoting a lucrative tech-related investment project. After examining the video using A.I. detection tools and escalating it to our expert partners for assessment, we concluded that the video was created by stitching together unrelated visuals with an A.I.-generated audio track.

The two-minute-and-45-second video in English was sent to the DAU tipline for assessment. It opens with visuals of a news presenter talking to the camera in a studio setting. A logo resembling that of an Indian English news channel is visible on the laptop placed in front of her.

The female voice accompanying the anchor’s visuals peddles the supposed project as a collaboration between Mukesh Ambani, chairperson of Reliance Industries, and Nandan Nilekani and N.R. Narayana Murthy, two of the co-founders of the tech giant Infosys. This is followed by a sequence of separate clips featuring the businessmen and a male news anchor, not from the same channel as the first anchor, promising high returns on investment in a short span of time.

The distinct voices recorded over the visuals of the businessmen have a non-Indian accent, and the delivery is robotic. The audio accompanying the female news anchor’s clip sounds foreign in parts and her delivery is monotonous, while the male news anchor’s audio does not sound non-Indian and his delivery is close to his natural style of speaking as seen on television.

Clips of celebrities cheering at public gatherings, long queues, cash sorting machines, and people operating computers have been stitched in at various points in the video.    

In the close-ups of Mr. Ambani, we noticed awkward angles of his lower teeth and a lag between the movements of his lips and the corresponding speech. The lip movements of Mr. Nilekani were mostly consonant with the audio track during the 14 seconds of his close-ups, but not when the camera wasn’t zoomed in on his face.

The lip movements of Mr. Murthy seem consonant with the audio during his close-ups, except for a three-second segment in which a strange quivering around his lower lip appears to compress the top of the microphone he is holding. There is also a visible dissonance between the visuals and the audio in the portions that don’t focus solely on his face.

To trace the original sources of the clips featured in the video, we ran reverse image searches using screenshots from the video.
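
As context for readers unfamiliar with this step, the sketch below shows one common way to pull screenshots from a video for reverse image searching. It assumes Python with OpenCV installed; the file name and the one-frame-per-second sampling rate are illustrative choices, not details of the DAU’s actual workflow.

```python
# Minimal sketch: sample roughly one frame per second from a video so the
# saved screenshots can be fed to a reverse image search service.
# "suspect_video.mp4" is a placeholder file name.
import cv2

capture = cv2.VideoCapture("suspect_video.mp4")
fps = capture.get(cv2.CAP_PROP_FPS) or 25  # fall back if metadata is missing

frame_index = 0
saved = 0
while True:
    ok, frame = capture.read()
    if not ok:
        break
    if frame_index % int(fps) == 0:  # keep about one frame per second
        cv2.imwrite(f"frame_{saved:04d}.png", frame)
        saved += 1
    frame_index += 1

capture.release()
print(f"Saved {saved} screenshots for reverse image searching")
```

The saved frames can then be uploaded to any reverse image search engine to hunt for the original footage.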

Ambani’s clip could be traced to this video, published on Dec. 23, 2017, from the official YouTube channel of Jio, a subsidiary of Jio Platforms Limited; Reliance owns the majority stake in the company. The visuals of the cheering celebrities could also be traced to this video.

A search using keywords visible in the backdrop of Murthy’s clip led us to this video, published on July 7, 2023, from the official YouTube channel of Moneycontrol, a business news and analysis website. Nilekani’s visuals were from this video, published on Aug. 27, 2023, from the official YouTube channel of NDTV, an Indian media house.

While the backdrop, clothing, and body language of the businessmen in the original videos and their clips in the manipulated video are identical, the audio tracks are different. None of them mention anything about a collaborative venture to help Indians make easy money. The visuals of the computers, queues of people, and cash sorting machines could not be found in any of the original videos.

The male news anchor’s visuals could be traced to this video, published on April 23, 2024, from the official YouTube channel of India Today, an Indian English news channel. In it, he talks about a political party’s manifesto; there is no mention of any money-making project. The logo of the news channel has been masked in the manipulated video with an inset of other visuals.

We were unable to trace the source of the clip featuring the female anchor; however, we managed to identify her. We ran a cluster of keywords through search engines, including her name, the media house she is affiliated with, and the names of the businessmen featured in the video, but that did not return any results about the project being promoted through the video or anything related to it.

To discern the extent of manipulation using A.I., we ran the video through A.I. detection tools.
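
In broad terms, such tools are typically exposed as web services that accept an uploaded media file and return per-category confidence scores. The sketch below illustrates that general pattern in Python; the endpoint, field names, and response shape are hypothetical placeholders, not the actual API of any tool named in this report, and a vendor-issued access token is assumed.

```python
# Generic sketch of querying a hosted deepfake-detection service over HTTP.
# Every identifier below marked as a placeholder must be replaced with the
# real values from the vendor's own documentation.
import requests

API_URL = "https://api.example-detector.com/v1/analyze"  # hypothetical endpoint
API_TOKEN = "YOUR_ACCESS_TOKEN"  # assumed credential

with open("suspect_video.mp4", "rb") as media:
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        files={"media": media},
        timeout=120,
    )
response.raise_for_status()

result = response.json()
# A service of this kind might return per-category confidence scores, e.g.
# {"ai_generated_audio": 1.0, "face_manipulation": 0.99}
for category, score in result.items():
    print(f"{category}: {score:.0%}")
```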

The voice detection tool of Loccus.ai, a company that specialises in artificial intelligence solutions for voice safety, returned results indicating a high probability that an A.I.-generated audio track was used in the video.

Screenshot of the analysis from Loccus.ai’s audio detection tool

We ran the video through Hive AI’s deepfake video detection tool. It indicated that the video was indeed manipulated using A.I. and pointed out markers on the faces of both Nilekani and Murthy. Their audio detection tool also indicated strong A.I. tampering in the video’s audio track.

Screenshot of the analysis from Hive AI’s deepfake video detection tool

We also put the video through the deepfake detector of our partner TrueMedia; their analysis suggested substantial evidence of manipulation in the video. In a breakdown of the overall analysis, their tool gave a 100 percent confidence score to the subcategory of “AI-generated audio detection” and a 78 percent confidence score to “audio analysis”, both indicating the detection of synthetic audio.

The tool also gave high confidence scores of 99 percent for “face manipulation” and 73 percent for “generative convolutional vision transformer”; both subcategories indicate A.I. manipulation of the faces featured in the video.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and video analysis from TrueMedia’s deepfake detection tool

For further analysis of the audio track featured in the video, we put it through the A.I. speech classifier of ElevenLabs, a company specialising in voice A.I. research and deployment. It returned results indicating an 84 percent probability that the audio was generated using their software.

We reached out to ElevenLabs for a comment on their analysis. They told us that they were able to confirm that the audio is synthetic, implying it was generated using A.I. They added that the user who violated their “terms of use” by generating the audio track has been blocked from their platform.

To get another expert to weigh in on the video, we reached out to GetReal Labs, a company co-founded by Dr. Hany Farid that specialises in digital forensics and A.I. detection. They confirmed that the audio was synthetic, which their analysis engine also flagged.

They noted that there were visible signs of lip-sync deepfakes throughout the video. These inconsistencies were evident in frames featuring Nilekani and Murthy, where the audio does not appear to be in sync with the video. Lip-sync artifacts were also visible on the mouth and teeth when Ambani was speaking.

They also noted that some segments featuring the main speakers ran longer than a few seconds. In our observation as well, this is unusual for financial scam videos like these, in which the main speakers typically appear only in brief flashes.

The team added that the presence of a grid overlay in the video made it harder to detect the lip-sync artifacts. They suggested that the overlay might have been added intentionally or might have resulted from the video being recorded off another screen.

Based on our observations and expert analyses, we assessed that the video featuring the billionaire trio was fabricated using original footage and A.I.-generated audio to falsely link the three businessmen to a dubious tech-related investment project.

(Written by Debraj Sarkar and Debopriya Bhattacharya, and edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.