Screengrabs of the videos analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed multiple videos that separately feature Anant and Nita Ambani, businesspersons associated with Reliance Industries and its philanthropic arm. In the videos they seem to be encouraging people to invest in a supposed gaming app on the pretext of Anant’s marriage, promising financial rewards in return. After putting the videos through A.I. detection tools and seeking inputs from experts, we assessed that the videos were manipulated using synthetic audio to propagate a financial scam.

The DAU reviewed three videos in Hindi, each embedded in a Facebook post. Of these, a 40-second video sent to the tipline and a two-minute-and-24-second video shared by a fact-checking partner featured Mr. Ambani; a 48-second video, also shared by a partner, featured Mrs. Ambani.

The video featuring Mrs. Ambani has garnered more than 3.9 million views on Facebook since it was first posted on the platform on June 29, 2024. One of the videos with Mr. Ambani has been viewed almost 113,000 times on Facebook since it was uploaded on July 12, 2024. Both videos were posted from a profile that bears the same name as the dubious app being promoted through the videos.

The views and the upload date for the video received on the tipline for assessment are not visible on Facebook. That video opens with visuals of a news presenter in a studio setting talking to the camera; a logo resembling that of BBC Hindi is visible in the top right corner of the video frame.

A female voice accompanying the anchor’s visuals links the introduction of the supposed gaming app to Mr. Ambani’s marriage. A male voice that follows, recorded over visuals of him apparently speaking to the camera, promises monetary returns from the app. The audio over the remaining visuals — bundles of cash, the app’s interface, Ambani at public appearances, people celebrating — oscillates between the same male and female voices and a second male voice narrating how the app helps make quick money.

In the other video featuring Mr. Ambani, he seems to be speaking to the camera. The male voice accompanying his visuals can be heard promising a sum of one trillion rupees on the occasion of his marriage and urging people to download the supposed gaming app, owned by him, to receive monetary returns irrespective of the gaming outcome. The video ends abruptly at the 14-second mark; the rest of it is a black screen with arrow prompts to download the app.

The video of Mrs. Ambani seems like an interview given to a press crew waiting outside a venue. A close-up shows her speaking into a handheld microphone bearing a logo resembling that of the news agency ANI. A cluster of other microphones bearing logos resembling those of leading Indian television channels is also visible in the video. A disembodied male voice can be heard posing questions.

A female voice recorded over her visuals can be heard touting the same dubious app as her son Anant’s favourite app, encouraging people to play, and promising to donate one trillion rupees on the occasion of her son’s marriage. The other visuals in the video include the app’s interface, graphics showing multiple returns, and winners celebrating.

A grammatical oddity in the speech in this video is that the female voice refers to the self in the masculine gender. Moreover, Mrs. Ambani’s lip movements are not in sync with the words that can be heard. The quality of the video is very poor, with heavy pixelation around her mouth.

Of the two videos of Mr. Ambani that we analysed, the one that features only him does not show signs of dissonance between the lip movements and the corresponding words. However, in the other video the lip synchronisation of all the featured subjects is visibly imperfect, including that of Mr. Ambani, the anchor, and the third person. A strange quivering around Mr. Ambani's lower lip is evident in a four-second segment; the quivering lip appears to press into the microphone held very close to his mouth.

In all three videos the intonation of the subjects sounds robotic, scripted, and devoid of emotion. The overall delivery of the Ambanis is different from that observed in their public speeches.

We undertook a reverse image search using screenshots from the videos to find the origin of the various clips interspersed in them.

We were able to trace the news anchor’s visuals to this video published by the official YouTube channel of BBC News Hindi on May 7, 2024. She talks about Gaza in the news story, and there is no mention of the Ambanis. An image of Mr. Ambani with a representation of the app is visible as an inset in the backdrop of the manipulated video, but it is not present in the original video. That image too is manipulated and has been lifted from the photo seen here. The other clips could be traced to this interview he gave to a television channel, which does not mention any app, and to his pre-wedding celebrations.

The visuals for the second clip featuring Mr. Ambani were lifted from a press conference that he addressed in February 2024; it mentions nothing about a gaming app. Mrs. Ambani’s clips were from the press coverage of her visit to a temple in Varanasi in June 2024.

While the backdrop, clothing, and body language of the subjects in the original and manipulated videos are identical, the audio tracks are not. The visuals of the dubious app, bundles of money, and people celebrating could not be found in any of the original videos. Nor do the Ambanis mention anything about donating a trillion rupees.

To discern whether A.I. had been used to manipulate the videos, we put them through A.I. detection tools.

The voice detection tool of Loccus.ai, a company specialising in artificial intelligence solutions for voice safety, indicated a high probability that synthetic audio was used in the video received on the tipline, which featured Mr. Ambani and the anchor. For the other two videos, the tool indicated a high probability that the audio track was real, possibly because both videos contain background noise, which can affect a tool’s results.

Screenshot of the analysis from Loccus.ai’s audio detection tool for Ambani’s video received on the tipline 

For both videos featuring Mr. Ambani, Hive AI’s deepfake video detection tool indicated that the videos were manipulated using A.I. However, the tool did not indicate any signs of A.I. manipulation in the video featuring Mrs. Ambani.

Hive’s audio detection tool indicated tampering using A.I. in the audio tracks of all three videos.

Screenshot of the analysis from Hive AI’s deepfake video detection tool

Screenshot of the analysis from Hive AI’s deepfake video detection tool

Screenshot of the analysis from Hive AI’s deepfake video detection tool

For further assessment, we put the videos through TrueMedia’s deepfake detector, which suggested substantial evidence of manipulation in all three videos. The confidence scores returned indicated the use of synthetic audio as well as A.I. manipulation of the subjects’ facial features across the three videos.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and video analysis from TrueMedia’s deepfake detection tool

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and video analysis from TrueMedia’s deepfake detection tool

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and video analysis from TrueMedia’s deepfake detection tool

To get an expert analysis of the audio tracks in the videos, we escalated the videos to ElevenLabs, a company specialising in voice A.I. research and deployment. They told us that they could not confirm that the audio track of the video featuring Mr. Ambani and the anchor originated from their platform. However, they were able to confirm that the audio in the other video of Mr. Ambani is synthetic, implying that it was generated using A.I.

ElevenLabs also told the DAU that the audio track in the video featuring Mrs. Ambani is synthetic. They added that they have blocked the user who broke their “terms of use” while generating the audio on their platform.

While we were confident that synthetic audio had been used in the three videos, we wanted an expert view on the one video of Mr. Ambani in which his lip-sync seemed convincing. We shared it with our partner GetRealLabs, co-founded by Dr. Hany Farid; he and his team specialise in digital forensics and A.I. detection.

They confirmed it to be a case of a lip-sync deepfake. They noted that the audio was synthetic and the lip-sync was quite good, making it hard to detect. Pointing to visual artifacts in the video, they added that Mr. Ambani’s teeth are distinctive in real life but less so in this manipulated video.

The Ambanis have been falsely linked to dubious gaming apps through several videos in recent months; the DAU has debunked some of those videos, which had been produced using synthetic audio.

Based on our observations and analyses from experts, we can conclude that the three videos we reviewed were fabricated using generative A.I. They represent a malicious attempt to associate the Ambani family with a dubious product which they neither own nor endorse.

(Written by Debraj Sarkar, Debopriya Bhattacharya, and edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.

You can read below the fact-checks related to this piece published by our partners:

Deepfake Ads Of Nita Ambani, Anant Ambani, Yogi Adityanath, Gautam Adani Endorsing Shady Gaming App Go Viral

Videos of Anant, Nita Ambani & Virat Kohli ‘Promoting’ Gaming Apps Are Deepfakes