The Deepfakes Analysis Unit (DAU) analysed two videos apparently packaged as exit polls from leading Indian news channels projecting victory for Mahabal Mishra, a candidate fielded by the Aam Aadmi Party (AAP) from the West Delhi Lok Sabha constituency. One of the anchors has clarified on X, formerly Twitter, that the voice in the video was not his, and that a synthetic voice was used to doctor a video featuring him. After examining the videos using A.I. detection tools, we concluded that A.I.-generated audio tracks were added to the visuals in both videos.
We received the 26-second video featuring Sudhir Chaudhary, an anchor with a Hindi news channel, on the DAU tipline, and a 41-second video featuring Manak Gupta, also an anchor with a Hindi news channel, through a fact-checking partner for verification. Both clips are in Hindi and appear to bear logos resembling those of the news channels that Mr. Chaudhary and Mr. Gupta are respectively associated with.
Each video opens with the news anchor and then moves on to other visuals, with a voice in the background reinforcing that Mr. Mishra is slated to trump his opponents in the West Delhi constituency. Voting for the seat began at 7:00 a.m. on May 25, in the penultimate phase of the ongoing seven-phase general election in India.
In the video featuring Chaudhary, graphics with numbers labelled as “exit polls” are visible on the left, with those numbers supposedly indicating the trends for the seven Lok Sabha seats being contested from Delhi. On the right side of the screen the visuals keep changing: Chaudhary’s appearance in the initial few seconds is followed by brief clips from the campaign trails of AAP leader and Delhi Chief Minister Arvind Kejriwal, Bharatiya Janata Party (BJP) candidate Kamaljeet Sehrawat, who is contesting from West Delhi, and Mishra.
In the portions of the video that feature Chaudhary’s narration, the audio is not synchronised with his lip movements at all; the delivery sounds robotic, with no change in pitch or tone, which is uncharacteristic of the way he is known to speak.
In the video featuring Gupta, after a few seconds of his appearance, only a series of graphics is visible on the screen, flashing numbers indicative of Mishra’s victory without the anchor’s face or any other visuals. The anchor’s lip movements and the audio in the initial few seconds do not align at all. The voice sounds like Gupta’s, but the delivery is monotonous and sounds synthesised.
We ran a reverse image search using screenshots from the portions of the videos that featured only the anchors.
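For readers curious about the mechanics of this step, below is a minimal sketch of how stills can be pulled from a suspect clip so they can be uploaded to a reverse image search engine. The file name and the one-frame-per-second sampling rate are illustrative assumptions, not a description of the DAU’s internal tooling.

```python
# Minimal sketch: sample one frame per second from a clip so the stills can be
# fed to a reverse image search engine. The file name "suspect_clip.mp4" and
# the sampling rate are illustrative assumptions only.
import cv2  # pip install opencv-python

def extract_frames(video_path: str, out_prefix: str = "frame", every_n_seconds: float = 1.0) -> list[str]:
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0      # fall back to 25 fps if metadata is missing
    step = max(1, int(round(fps * every_n_seconds)))
    saved, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            name = f"{out_prefix}_{index:05d}.png"
            cv2.imwrite(name, frame)
            saved.append(name)
        index += 1
    cap.release()
    return saved

if __name__ == "__main__":
    print(extract_frames("suspect_clip.mp4"))
```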
Screengrabs of Chaudhary led us to this video, uploaded on May 16, 2024 from the official YouTube channel of Aaj Tak, a Hindi news channel. The setting, clothes, and body language of the anchor are exactly the same in that video and the manipulated one, but the visuals of the leaders and the numbers labelled as “exit polls” are completely absent from the original.
Screengrabs of Gupta led us to this video, posted on March 29, 2024 from the official YouTube channel of News24, a Hindi news channel. Analysing the clothing, body language, and backdrop of the anchor, we could surmise that this was the original from which snippets were taken to produce the fake video. The graphics predicting the poll outcome are absent from the original video.
Our investigation helped us establish that the videos are a patchwork of unrelated clips with narration added on top of them. To discern whether the fabricated audio in both videos was produced using generative A.I., we put the videos through A.I. detection tools.
For the video featuring Chaudhary, Hive AI’s deepfake video detection tool indicated A.I. manipulation in extended portions of the video, and their audio tool also detected traces of A.I. in the audio track.
We also used TrueMedia’s deepfake detector, which overall categorised the video as “highly suspicious”, suggesting a high probability of manipulation. It gave a 100 percent confidence score for the subcategory of “A.I. generated audio detection”, an indicator of synthetic speech. The tool also returned very high confidence scores for visual manipulation, more than 90 percent each for the subcategories of “deepfake face detection” and “face manipulation”.
For the video featuring Gupta, Hive AI’s deepfake detection tool indicated A.I. manipulation in the close-up of the anchor’s face, and their audio tool also caught traces of generated speech in the audio.
We also used TrueMedia’s deepfake detector, which overall concluded that there was a very high probability of manipulation in the video, and gave a 95 percent confidence score for “A.I. generated audio detection”. The tool also gave a 90 percent confidence score for “face manipulation”, suggesting visual tampering using A.I.
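The subcategory scores reported above are easiest to read as independent checks that get rolled up into a single verdict. The sketch below shows one simple way such a roll-up could work; the threshold values and the “take the strongest signal” rule are our own assumptions for illustration, not the actual scoring logic of TrueMedia or Hive AI.

```python
# Illustrative roll-up of per-check confidence scores (0-100) into a verdict.
# The thresholds and the "take the strongest signal" rule are assumptions for
# illustration only, not the actual logic of any detection tool named above.
def overall_verdict(scores: dict[str, float]) -> str:
    strongest = max(scores.values())
    if strongest >= 90:
        return "highly suspicious"
    if strongest >= 70:
        return "suspicious"
    return "little evidence of manipulation"

# Scores reported above for the video featuring Gupta.
gupta_scores = {
    "A.I. generated audio detection": 95,
    "face manipulation": 90,
}
print(overall_verdict(gupta_scores))   # -> "highly suspicious"
```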
On the basis of our findings, and the comment from one of the anchors on X, we can conclude that the voices heard in the videos are not the anchors’ own and were produced using generative A.I.
(Written by Debraj Sarkar with inputs from Debopriya Bhattacharya and Areeba Falak, and edited by Pamposh Raina.)
Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.
You can read below the fact-check related to this piece published by our partners:
वोटिंग से पहले दिल्ली में इंडिया गठबंधन को बढ़त का अनुमान लगाते ये वीडियो एडिटेड हैं (These videos projecting a lead for the INDIA alliance in Delhi ahead of voting are edited)