The Deepfakes Analysis Unit (DAU) analysed two videos that apparently feature Rajat Sharma, India TV news anchor and chairperson, and celebrity doctors endorsing miraculous cures for poor eyesight and obesity. Mr. Sharma clarified in a post on X that his cloned voice is being misused to promote fake remedies. After putting the videos through A.I. detection tools and getting our expert partners to weigh in, we were able to establish that both videos carry original footage that has been manipulated with synthetic audio.
Dr. Rahil Chaudhary, an ophthalmologist, features in one of the videos, which spans 10 minutes. Dr. Deepak Chopra, an Indian-American author and alternative medicine advocate, features in the other video, which is two minutes and 20 seconds long. Both videos are in Hindi; they were escalated to the DAU by fact-checking partners for analysis.
The video with Sharma and Dr. Chaudhary has been packaged like a television news story. It opens with Sharma in a studio-like setting apparently talking to the camera. A logo resembling that of India TV is visible in the top-right corner of the video frame throughout. Text graphics in Hindi at the bottom of the video frame claim that a lemon can fix eye problems within seven hours.
A male voice recorded over Sharma’s video track announces that an Indian specialist has discovered a remedy that can help fix poor vision and related ailments in “48 hours”. That voice also states that some “5,000,000 people” have been cured by it so far.
The visuals then shift to supposed interviews with five people — four women and a man — who claim that the purported remedy improved their vision. Each clip has a different setting and a separate audio track. The hand-held microphones used in three of these clips bear a logo identical to that of PTI, an Indian news agency; a logo resembling that of AajTak, a Hindi news channel, is also visible on another microphone.
The next segment in the video features a 21-second clip of Chaudhary, who appears to be speaking into a big microphone in an indoor setup. A male voice recorded over his video track, different from the other voices heard in the video, claims that within a month 100,000 Indian citizens have been able to regain their eyesight using the supposed remedy.
That voice guarantees that the purported cure is 100 percent effective and directs viewers to click on some button to access a video offering further details; however, neither the button nor the video is visible anywhere. A still image of a cartoon car pops up in the video every few seconds, and after the video runs for about one minute and 16 seconds the same image stays on screen for the remaining duration.
In October, the DAU debunked a fabricated video in which Chaudhary seemingly promoted the exact same supposed remedy for poor eyesight as the one being peddled through this video. That video featured Chaudhary with another news anchor.
The second video that we are debunking in this report is also packaged like a television news story, and it too opens with Sharma apparently speaking to the camera from a studio-like setup. The male voice accompanying his video track announces a supposed breakthrough weight loss remedy which can take effect within 24 hours. It claims that common kitchen ingredients can be used to concoct the purported treatment.
The next segment in the video features Dr. Chopra, who seems to be speaking to the camera from a casual setup like that of a home office. A male voice accompanying his video track, distinct from the one with Sharma’s video track, claims that the supposed remedy is safe to consume and can help shed one to two kilograms every 17 hours. It also promises that obesity and several related ailments can be cured without any exercise.
That voice urges viewers to click on a button to be able to view another short video, which supposedly has a limited viewing window and will be accessible to only 10 people. However, there is no visible clickable button or link for the other video. The language used with this audio track is not conversational and is difficult to understand.
A logo resembling that of India TV alternates between English and Hindi spellings in the top-right corner of the video frame, and the words “Aaj ki baat”, written in Hindi, appear in the top-left corner. In between, a banner graphic carries text in Hindi mentioning that most Indians do not know about the supposed remedy; graphics at the bottom highlight its efficacy, and packed within them is yet another logo that reads “SPEED NEWS”.
The transcript of the audio also runs across the bottom of the frame as the video plays. A graphic of a timer, which remains unchanged throughout, is visible in the bottom-right of the video frame; below that, the address of the India TV website seems to appear.
At least two random still images pop up multiple times throughout this video. One is a collage of elephant art; the other depicts a single elephant and carries a stylised logo bearing the words “India Taj”.
In both videos Sharma’s lip-sync is imperfect, as is that of Chaudhary and Chopra in their respective videos. The dentition of the three men is inconsistent, and the area below the chin and around the neck looks blurry for each of them in their respective clips. Another similarity is that the subjects’ mouths appear to keep moving even when there is a break in the audio accompanying their respective video tracks.
Chaudhary’s visuals also have blurry cheeks and odd eye movements. A quivering lower lip along with blur around the lower jaw is visible in the frames featuring Chopra.
The voice attributed to Sharma in both videos sounds similar to his voice in his recorded videos available online. It is also marked by the pauses and accent characteristic of his style of delivery; however, it sounds scripted and devoid of any natural intonation. In the video that features Sharma and Chaudhary, a slight noise can be heard between sentences in the audio with Sharma’s clip; this seems to be an attempt to mimic human breath.
When the audio attributed to Chaudhary is compared with his recorded interviews, the accent sounds somewhat similar, but the difference in voice quality and delivery is evident; the audio also lacks natural pauses.
The depth in the voice with Chopra’s clip and the pauses captured in it make it sound similar to his voice as heard in his recorded interviews. But the monotone is uncharacteristic of his manner of speaking.
The video featuring Sharma and Chaudhary also has supposed interviews with five people, as mentioned above. The lip-sync for all of them is imperfect and their voices lack natural intonation. A grammatical oddity in the audio track of the only man among the five is that he refers to himself in the feminine.
We undertook a reverse image search using screenshots from the videos under review to trace the origin of the different clips seen in them.
For the video that features Sharma and Chaudhary, Sharma’s clip was traced to this video published on Aug. 20, 2024 from the official YouTube channel of India TV. Chaudhary’s clip was traced to this video published on March 29, 2024 from the official YouTube channel of Ranveer Allahbadia, a popular content creator.
The clips of the five other people in the video could not be found in either of the videos we traced.
The clips of Sharma and Chopra used in the other video have also been taken from two separate videos. Sharma’s clip was lifted from this video published on June 11, 2024 from the official YouTube channel of India TV. Most of Chopra’s video track seems to have been taken from this video published on Oct. 9, 2024 from the official YouTube channel of The Chopra Well, which promotes content featuring Chopra. However, at least three other videos of Chopra posted from the same channel show him in the same outfit and backdrop, so some snippets may have been lifted from any of those videos as well.
The clothing, backdrop, and body language of Sharma, Chaudhary, and Chopra in their respective clips in the videos we reviewed match those in the videos we located. None of the subjects mentions anything about any unscientific remedy in the original videos, nor do those videos carry the text graphics visible in the doctored videos. The logos visible in the original videos are also missing from the manipulated videos.
The video frames in Sharma’s clips in both the manipulated videos appear more zoomed in. A laptop visible in the foreground, in both original videos, has been edited out in the doctored videos.
In the original videos, Chopra speaks in English, Chaudhary alternates between Hindi and English, and Sharma speaks in Hindi.
To discern the extent of A.I.-manipulation in the videos under review, we put them through A.I.-detection tools.
The voice tool of Hiya, a company that specialises in artificial intelligence solutions for voice safety, indicated that there is an 85 percent probability that an A.I.-generated voice was used in the video featuring Sharma and Chaudhary.
For the video featuring Sharma and Chopra, the tool returned results indicating that there is a 99 percent probability of the audio track being A.I.-generated.
Hive AI’s deepfake video detection tool indicated that the video featuring Sharma and Chaudhary was manipulated using A.I. However, the tool traced evidence of manipulation only in Chaudhary’s frames.
For the other video featuring Sharma and Chopra, the tool did not indicate any A.I. manipulation in the video track.
Hive AI’s audio detection tool found evidence of A.I. manipulation in most of the audio track attributed to Sharma and Chaudhary, except for the last 10 seconds. In comparison, evidence of A.I. manipulation was found throughout the audio track attributed to Sharma and Chopra.
The deepfake detector of our partner TrueMedia indicated substantial evidence of manipulation in the video featuring Sharma and Chaudhary. The “A.I.-generated insights” from the tool suggest that it is an exaggerated or fabricated piece that uses promotional language and lacks scientific evidence.
In a breakdown of the overall analysis, the tool gave a 99 percent confidence score to the “face manipulation detector” subcategory, which detects potential A.I. manipulation of faces, as in the case of face swaps and face reenactment. The tool also gave a 56 percent confidence score to the “video facial analysis” subcategory, which analyses video frames for unusual patterns and discrepancies in facial features.
The tool gave a 100 percent confidence score to the “A.I.-generated audio detector” subcategory that detects A.I.-generated audio. It gave an 83 percent confidence score to “voice anti-spoofing analysis”, a subcategory that analyses audio for evidence that it was created by an A.I.-audio generator.
The tool detected substantial evidence of manipulation in the video featuring Sharma and Chopra as well. The “A.I.-generated insights” from the tool suggest that the promotional language, lack of scientific backing and mention of urgency make the video transcript sound like a scam advertisement.
The tool gave a 99 percent confidence score to the subcategory of “face manipulation detector”. It gave a confidence score of 100 percent to the subcategories of “A.I.-generated audio detector” and “voice anti-spoofing analysis”.
For further analysis of the audio tracks, we also put them through the A.I. speech classifier of ElevenLabs, a company specialising in voice A.I. research and deployment. The tool returned results indicating that it was likely that the audio tracks used in both videos were generated using the company’s platform.
To get another expert to weigh in on the videos, we escalated them to our partner GetReal Labs, a company co-founded by Dr. Hany Farid that specialises in digital forensics and A.I. detection.
The team said that the evidence strongly suggests that the video featuring Sharma and Chaudhary contains synthetic, A.I.-generated material. They added that voice analysis techniques on all the featured speakers in the video indicate that the audio is synthesised, likely created using ElevenLabs.
They ran a reverse image search against every subject, but could only trace the original video for Chaudhary’s clip. They added that the original video does not appear to be selling a product or pushing misinformation.
They also pointed to cartoon frames inserted at key points in the video, something we observed in our initial analysis as well, and assessed that this may be an attempt to disrupt face detection algorithms that operate only on keyframes. They added that the cartoon image used to extend the video’s length to 10 minutes may be intended to bypass automated filters that have known maximum time limits.
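To illustrate why such insertions could matter, below is a minimal, hypothetical Python sketch of frame sampling; it is our illustration only, not a tool used by the DAU or GetReal Labs, and the file name and sampling interval are assumptions. A detector that samples a video at fixed intervals analyses whichever frame lands on each sampling point, so a cartoon still placed at those points returns no faces and contributes nothing to a face-based verdict.

    # Illustrative sketch only, not the DAU's or GetReal Labs' tooling.
    # Many automated pipelines sample a video at fixed intervals rather
    # than reading every frame; cartoon stills inserted at exactly those
    # points would be what gets analysed, not the manipulated face.
    import cv2  # requires the opencv-python package

    VIDEO_PATH = "suspect_video.mp4"  # hypothetical input file
    SAMPLE_EVERY_N = 30               # e.g. one sampled frame per second at 30 fps

    # A basic face detector bundled with OpenCV, used here as a stand-in
    # for whatever detector an automated filter might run.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    cap = cv2.VideoCapture(VIDEO_PATH)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % SAMPLE_EVERY_N == 0:  # keyframe-style sampling
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            # A cartoon still landing on a sampled frame yields zero faces,
            # so that frame adds nothing to a face-based analysis.
            print(f"frame {frame_idx}: {len(faces)} face(s) detected")
        frame_idx += 1
    cap.release()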
For the video featuring Sharma and Chopra, too, they observed that the evidence strongly suggests it contains synthetic, A.I.-generated material. They added that multiple analysis techniques suggest the audio track has been synthesised and that the mouth movements are not natural. They believe the voices of both subjects were likely created using ElevenLabs.
They also pointed to the close cropping of the background to remove temporal identifiers; the time shown in the bottom banner, which does not change throughout the video’s runtime; and the reuse of the bottom banner in a similar video found on Facebook.
They found that the video contains frames of three different illustrations of elephants that are repeated at fixed intervals, and speculated that these may have been inserted in an attempt to bypass content moderation filters.
They ran a reverse image search and pointed to the same original videos that we were able to locate. They added that neither of those videos includes any discussion of the weight loss program promoted in the doctored video, as we pointed out in our analysis above.
To get another expert analysis on the videos, we escalated them to the Global Deepfake Detection System (GODDS), a detection system set up by Northwestern University’s Security & AI Lab (NSAIL).
Both videos were analysed by two human analysts and run through 22 deepfake detection algorithms. In addition, the team used seven deepfake detection algorithms to analyse the audio track in the video featuring Sharma and Chaudhary.
For the Sharma and Chaudhary video, five of the 22 models gave a higher probability of the video being fake, while the remaining 17 models indicated a lower probability. For the audio track, six of the seven models indicated a higher probability of the audio being fake, while the remaining model indicated a lower probability.
In their report, the team echoed our observations about a recurrent cartoon tractor image flashing in the video. They noted that it does not align with the video’s news-like content, indicating the possibility that the video has been manipulated. Their observations on the visual oddities in the faces of Sharma and Chaudhary corroborated our analysis.
They also pointed out that in various instances in the video Sharma’s glasses appear to merge into his face, and added that blurring can be used to conceal visual manipulations. In Chaudhary’s segment they pointed to his unnatural mouth distortions, which they said appear almost as a “visual glitch”.
For the video featuring Sharma and Chopra, six of the 22 models gave a higher probability of the video being fake, while the remaining 16 models indicated a lower probability of the video being fake.
The team identified visual artifacts in the video which they noted are common among manipulated media samples. They highlighted instances of Chopra’s face changing shape and his hands blurring into his chin; and Sharma’s glasses blending into his face and unnatural shadows forming around his shirt collar. The team’s observations on the mouth movements and the audio track corroborate our analysis.
Echoing our observations, they also pointed to recurrent flashes of illustrated elephant images, suggesting that these could have been included to obscure manipulations.
In the overall verdict, the GODDS team concluded that the two videos are likely fake and generated or manipulated with artificial intelligence.
On the basis of our findings and analysis from experts, we can conclude that original footage was used with synthetic audio to fabricate both videos. There seems to be a continued attempt to misinform and scam people through content that promotes unscientific remedies. Public figures and medical practitioners are falsely linked to such content to create the impression that they are backing it. The DAU has also debunked manipulated videos that have featured Sharma with celebrities and doctors such as Amitabh Bachchan and Dr. Bimal Chajjer.
(Written by Debopriya Bhattacharya and Debraj Sarkar, edited by Pamposh Raina.)
Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.
You can read below the fact-checks related to this piece published by our partners:
Fact Check: Video Claiming Lemon Water Can Cure Vision Problems In 7 Minutes Is AI-generated