The Deepfakes Analysis Unit (DAU) analysed a video featuring Indian Finance Minister Nirmala Sitharaman and Reserve Bank of India (RBI) Governor Shaktikanta Das, apparently promoting an investment platform, supposedly endorsed by the government. The RBI has issued a statement clarifying that the organisation is not involved in any such initiative and has urged the public not to fall for such fake videos. After running the video through A.I. detection tools and getting our expert partners to weigh in, we were able to conclude that the video was manipulated using A.I.-generated audio.
The two-minute-and-43-second video in English was escalated to the DAU by a fact-checking partner for analysis. The video opens in a studio-like setting with a female anchor speaking to the camera. Her backdrop seems to have a screen with an image of Ms. Sitharaman projected on it. A female voice recorded over her video track announces that Sitharaman has accused Mr. Das and the Indian government of withholding information from Indian citizens about a lucrative income-generating platform.
The next segment in the video shows Sitharaman at what seems like a public event. She can be seen holding a microphone that bears a logo resembling that of CNN-News18, an English-language television news channel in India. A female voice recorded over her video track alleges that the Indian government and Das are being secretive about the supposed financial platform because of some “personal issues” with her.
The anchor makes a brief appearance to introduce a supposed statement from Das. A clip of Das that follows shows him speaking at a public event. Given the different backdrops in his clip and the one featuring Sitharaman, it does not appear that the two clips were recorded at the same event.
The male voice with Das’s video track justifies the secrecy and declares that the supposed platform was being tested and has been deemed highly profitable, adding that it has even received government support.
That voice suggests that the project is tech-driven, linked to A.I.-based algorithms for automated cryptocurrency trading, and accessible on any device. Viewers are urged to invest their money by registering with a supposed official website of the project. The video fades into a black screen with capitalised words indicating that there is a link below, though no link is visible anywhere in the video.
The video has been packaged as a news story. However, there is no visible media house logo in any of the video frames. Text graphics at the bottom of the video promise investors huge returns for a relatively small amount of money, and a “breaking news” super in bold letters can be seen above that.
Of the three subjects, Das’s lip movements appear most synchronised with the accompanying audio track. Sitharaman’s and the anchor’s lip-sync is off the mark, and in a few instances, especially in the minister’s case, her lips appear to keep moving even after the audio stops.
Sitharaman’s segment is marked by her dentition resembling a white patch. The contours of her teeth appear to be missing, and her upper set of teeth seems to disappear and reappear across frames. The segment featuring her ends with a clip that seems to have been used in a loop. We noticed a jerk in that segment, after which her hand movements look identical to those seen in the previous few seconds.
The female voice recorded over Sitharaman’s video track sounds somewhat like hers and tries to convey her characteristic accent. However, it lacks the diction and pitch heard in her recorded speeches and interviews.
In the segment featuring Das, the shape of the microphone placed in front of him distorts each time his chin gets close to it and the contours of his jawline seem to change simultaneously. His teeth are slightly blurred in some frames and his lower set of teeth appear unnaturally elongated for a brief moment.
The voice with Das’s video track sounds somewhat like his and even attempts to mimic the natural pauses of human speech. However, the accent sounds slightly foreign, the delivery is hurried and scripted without any change in tone or pitch, and the voice is not as deep as that heard in his recorded interviews and speeches.
The audio with the anchor’s video track sounds scripted. It has a monotone and an odd foreign accent; the pronunciation of Sitharaman’s and Das’s names sounds particularly peculiar.
Reverse image searches using screenshots from the video helped establish that the video we reviewed has been spliced together from three unrelated clips. We were unable to locate the source of the clip featuring the anchor; the poor video quality seems to have affected the reverse image search results in her case.
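For readers unfamiliar with the process, the sketch below shows one way screenshots can be pulled from a video at regular intervals before being submitted to a reverse image search engine. The file name and sampling interval are illustrative assumptions, not a description of the DAU's exact workflow.

```python
# A minimal sketch of frame extraction ahead of a reverse image search.
# File names and the sampling interval are hypothetical.
import cv2

VIDEO_PATH = "suspect_video.mp4"   # hypothetical input file
INTERVAL_SECONDS = 5               # grab one frame every 5 seconds

cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
step = int(fps * INTERVAL_SECONDS)

frame_index = 0
saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_index % step == 0:
        # Each saved frame can then be uploaded to a reverse image search engine.
        cv2.imwrite(f"frame_{saved:03d}.jpg", frame)
        saved += 1
    frame_index += 1
cap.release()
print(f"Saved {saved} frames for reverse image search")
```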
Sitharaman’s segment led us to this video, published on the official YouTube channel of CNN-News18 on Sept. 16, 2024. Das’s segment led us to this video, published on the official YouTube channel of the RBI on Oct. 9, 2024. The clothing, backdrop, and body language of Das and Sitharaman in those videos and in the video we reviewed are identical.
Das speaks in English in his original video, while Sitharaman can be heard speaking in both Hindi and English in hers; neither of them mentions anything about a financial investment platform.
The doctored video seems to have used zoomed-in frames from the original videos. There are no text graphics in either of the original videos. A news channel logo is visible at the top right corner of the frame in Sitharaman’s original video, but there is no official logo in Das’s original video.
To discern the extent of A.I.-manipulation in the video under review, we put it through A.I.-detection tools.
The voice tool of Hiya, a company that specialises in artificial intelligence solutions for voice safety, indicated that there is a 74 percent probability that the audio track in the video was A.I.-generated.
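Audio-focused tools such as this one analyse the sound track on its own, so a common preparatory step is to separate the audio from the video. Below is a minimal sketch of that step using the ffmpeg command-line tool; the file names and the 16 kHz mono output format are illustrative assumptions rather than a description of Hiya's or the DAU's actual pipeline.

```python
# A sketch of audio extraction with the ffmpeg CLI; paths are hypothetical.
import subprocess

def extract_audio(video_path: str, audio_path: str) -> None:
    """Strip the video stream and save mono 16 kHz PCM audio."""
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", video_path,        # input video
            "-vn",                   # drop the video stream
            "-ac", "1",              # mono
            "-ar", "16000",          # 16 kHz sample rate
            "-acodec", "pcm_s16le",  # uncompressed 16-bit PCM
            audio_path,
        ],
        check=True,
    )

extract_audio("suspect_video.mp4", "suspect_audio.wav")
```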
Hive AI’s deepfake video detection tool indicated markers only on the face of Sitharaman, pointing to A.I. manipulation in her visuals. Their audio detection tool traced A.I. manipulation in the segment carrying the audio attributed to the anchor and Sitharaman, but not in the audio with Das’s video track.
The deepfake detector of our partner TrueMedia suggested substantial evidence of manipulation in the video. The A.I.-generated insights offered by the tool provide additional contextual analysis, noting that the audio transcript reads like an online scam, making exaggerated claims in promotional language. The insights also highlight that it is highly improbable that a senior government official would publicly accuse another of lying.
In a breakdown of the overall analysis, the tool gave a 98 percent confidence score to the “deepfake face detector” subcategory that uses a visual detection model to detect faces and to check if they are deepfakes. The subcategories of “face manipulation detector” and “video facial analysis” received confidence scores of 94 percent and 59 percent respectively. The former detects potential A.I. manipulation of faces, as in the case of face swaps and face reenactment; and the latter checks video frames for unusual patterns and discrepancies in facial features.
The tool gave a 100 percent confidence score to the “A.I.-generated audio detector” subcategory that detects A.I.-generated audio. It gave a 68 percent score to the “audio authenticity detector” subcategory and a 59 percent score to the “voice anti-spoofing analysis” subcategory; both subcategories analyse whether the audio was generated by an A.I. audio-generator or cloning.
For further analysis of the audio track, we also put it through the A.I. speech classifier of ElevenLabs, a company specialising in voice A.I. research and deployment. The tool returned results indicating that it was likely that the audio track used in the video was generated using their platform.
To get an expert analysis on the video, we escalated it to the Global Deepfake Detection System (GODDS), a detection system set up by Northwestern University’s Security & AI Lab (NSAIL). They used a combination of 22 deepfake detection algorithms and analyses from two human analysts trained to detect deepfakes to review the video.
Of the 22 predictive models used to analyse the video, two gave a higher probability of the video being fake and the remaining 20 indicated a lower probability of the video being fake.
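To illustrate how such an ensemble readout can be summarised, the sketch below counts how many per-model scores cross a simple threshold. The scores are hypothetical and the logic is a generic illustration, not GODDS' actual aggregation method, which also weighs the judgement of trained human analysts.

```python
# An illustrative threshold count over hypothetical per-model fake probabilities.
from typing import List

def count_flags(fake_probabilities: List[float], threshold: float = 0.5) -> int:
    """Count how many models assign a fake probability above the threshold."""
    return sum(1 for p in fake_probabilities if p > threshold)

# Hypothetical scores for 22 models: 2 above the threshold, 20 below.
scores = [0.82, 0.71] + [0.2] * 20
flagged = count_flags(scores)
print(f"{flagged} of {len(scores)} models flagged the video as likely fake")
```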
In their report, the team noted that the facial features of the speakers are consistently blurry. They specifically pointed to an instance where the anchor’s visuals seemingly merge with the backdrop, flagging it as a possible sign of manipulation in the video.
They also pointed to a moment in the video when Das’s lips move unnaturally fast. Their observations on the speakers’ accents and delivery style corroborated our analysis. The team concluded that the video is likely fake and generated with artificial intelligence.
To get another expert to weigh in on the video, we escalated it to our partner GetReal Labs, a company co-founded by Dr. Hany Farid that specialises in digital forensics and A.I. detection.
The team said that evidence strongly suggests that the video contains synthetic, A.I.-generated material. They added that multiple techniques, such as spectrogram and lip-sync analysis, indicated that the audio track had been synthesised and that the mouth movements are not natural.
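Spectrogram analysis of the kind referenced here can be approximated with standard signal-processing libraries. The sketch below assumes the audio has already been extracted to a hypothetical suspect_audio.wav and simply renders a spectrogram for manual inspection; it is a generic illustration of the technique, not GetReal Labs' proprietary tooling.

```python
# A generic spectrogram rendering for manual inspection; file names are hypothetical.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, samples = wavfile.read("suspect_audio.wav")
if samples.ndim > 1:
    samples = samples.mean(axis=1)  # fold stereo down to mono

freqs, times, power = spectrogram(samples, fs=rate, nperseg=1024)

plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectrogram for manual inspection of possible synthesis artifacts")
plt.colorbar(label="Power (dB)")
plt.savefig("spectrogram.png", dpi=150)
```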
They believe that at least one of the voice clones, that of the anchor, was likely created using ElevenLabs. They noted that the background was closely cropped to remove temporal identifiers, and that the static banner at the bottom of the frame throughout the video is unlike those in most modern news reports.
They ran a reverse image search and pointed to the same original videos that we were able to locate. They added that neither of the original videos included discussions of the investment platform, as we too had noted in our analysis above.
On the basis of our findings and expert analyses, we can conclude that the video featuring Sitharaman and Das was manipulated using A.I.-generated audio to mislead the public with a non-existent financial investment program, falsely presented as government-backed and promising high returns.
(Written by Debraj Sarkar and Debopriya Bhattacharya, edited by Pamposh Raina.)
Kindly Note: The manipulated video/audio files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.
You can read below the fact-checks related to this piece published by our partners:
Clip of Nirmala Sitharaman, RBI Governor Promoting Investment App Is a Deepfake
Deepfake Video Of Nirmala Sitharaman and Shaktikanta Das Promoting Investment Scheme Goes Viral
Video Of Nirmala Sitharaman, RBI Governor Promoting Investment Project Is AI Voice Clone