Video of Anjana Om Kashyap, Dr. Rahil Chaudhary Promoting Cure for Poor Sight Is Fake

October 31, 2024

Manipulated Media/Altered Media
Screengrabs of the video analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video featuring Anjana Om Kashyap, an anchor with a Hindi news channel, and Dr. Rahil Chaudhary, an ophthalmologist, apparently endorsing a home remedy to fix poor eyesight. After running the video through A.I. detection tools and getting our expert partners to weigh in, we were able to conclude that the video was manipulated using A.I.-generated audio.

The four-minute-and-17-second video in Hindi was escalated to the DAU by a fact-checking partner for analysis. The video opens in a studio-like setting with Ms. Om Kashyap speaking to the camera. A female voice recorded over her visuals announces that an Indian doctor has discovered a miraculous cure that can restore eyesight in five minutes and benefit lakhs of visually impaired people.

The next segment in the video features Dr. Chaudhary, apparently speaking at a press conference. A male voice accompanying his visuals talks about the supposed cure, the basis of which seems to be an unscientific home remedy. That voice goes on to suggest that commonly available kitchen ingredients, such as lemons, can be used to concoct the supposed medicine.

A logo which closely resembles that of Aaj Tak, a Hindi news channel, is visible in the top left corner of the video frame. This logo, however, bears additional Hindi text inside the frame, which makes it read as “Crime Aaj Tak news”. Supers endorsing the supposed remedy are visible throughout the video, and the words “News 24” and “Live” appear at the bottom of the video frame on the left and right sides, respectively. An image of a Snellen chart, the lettered chart used for testing eyesight, flashes several times in the video.

Toward the end of the video, the voice attributed to Chaudhary suggests that a separate three-minute video offers details about accessing the remedy, but no link to that video is shared. A sense of urgency is conveyed by that voice as it claims that the video would be taken down due to complaints from pharmaceutical companies and would be accessible for only 15 minutes. The video ends abruptly after that.

In the 26 seconds that feature Om Kashyap, the audio and video tracks are out of sync. A single clip seems to have been looped to create that segment. In some frames her chin appears to distort and her dentition looks odd. The audio accompanying her visuals can also be heard in frames where her mouth is not moving at all.

Chaudhary’s video and audio tracks are also out of sync, and audio can be heard in frames where his mouth is not moving. A single video clip seems to have been looped in his case as well, though the playback order of the clip appears to change each time it is stitched to the preceding copy. There are several jump cuts in the segment featuring him.
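For illustration, the sketch below shows one way such looping could be surfaced programmatically: reduce every frame to a small grayscale signature and flag near-identical frames that sit far apart in the timeline. This is a minimal example assuming OpenCV and NumPy, with a hypothetical file name; it is not the DAU’s actual workflow.

    # Flag near-duplicate frames that are far apart in time, a telltale
    # sign of looped or reversed footage. "suspect.mp4" is hypothetical.
    import cv2
    import numpy as np

    def frame_signatures(path, size=(32, 32)):
        """Reduce each frame to a tiny grayscale thumbnail."""
        cap = cv2.VideoCapture(path)
        sigs = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            sigs.append(cv2.resize(gray, size).astype(np.float32))
        cap.release()
        return sigs

    def find_repeats(sigs, min_gap=25, threshold=4.0):
        """Pair up frames that are near-identical despite a large gap."""
        hits = []
        for i in range(len(sigs)):
            for j in range(i + min_gap, len(sigs)):
                if np.mean(np.abs(sigs[i] - sigs[j])) < threshold:
                    hits.append((i, j))
        return hits

    pairs = find_repeats(frame_signatures("suspect.mp4"))
    print(len(pairs), "near-duplicate frame pairs found")

The quadratic comparison is tolerable for a clip of this length; a production tool would typically use perceptual hashing to scale to longer footage.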

The female voice recorded over Om Kashyap’s visuals sounds somewhat like hers; however, her characteristic intonation and pitch, as heard in her recorded news shows, are missing from this video. The audio track with Chaudhary’s visuals bears some resemblance to his real voice, even capturing his accent and pauses when compared with his recorded interviews, but it sounds monotonic and scripted.

A reverse image search using screenshots from the video led us to two different sources and helped establish that the video we reviewed is composed of unrelated videos that have been spliced together.
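As a minimal sketch of the first step of such a search, the snippet below exports a frame every few seconds so that each still can be fed to a reverse image search engine. The file name, interval, and output naming are illustrative assumptions, not DAU practice.

    # Export stills at a fixed interval for manual reverse image searches.
    import cv2

    def export_frames(path, every_seconds=5, prefix="frame"):
        cap = cv2.VideoCapture(path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 25  # fall back if metadata is missing
        step = int(fps * every_seconds)
        index = saved = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % step == 0:
                cv2.imwrite(f"{prefix}_{saved:03d}.jpg", frame)
                saved += 1
            index += 1
        cap.release()
        return saved

    print(export_frames("suspect.mp4"), "stills exported")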

Om Kashyap’s segment led us to this video published on the official YouTube channel of Aaj Tak on Oct. 1, 2024, which features a political debate. Chaudhary’s segment led us to this video published on the official YouTube channel of Pachouli Wellness by Dr. Preeti Seth on Jan. 30, 2024; the doctor is heard discussing surgical interventions to cure eyesight problems, but there is no mention of any kind of home remedy.

The clothing, backdrop, and body language of the subjects in these two videos and their respective segments in the doctored video are identical. However, the logo and supers visible in the doctored video are missing from the original videos. The language of conversation in the original videos is also Hindi.

The frames in the doctored video are more tightly cropped. The original video featuring Om Kashyap has two other panellists in the same frame as her, while in Chaudhary’s video one other person is visible in the frame.

To discern the extent of A.I. manipulation in the video under review, we put it through A.I. detection tools.

The voice tool of Hiya, a company that specialises in artificial intelligence solutions for voice safety, returned results indicating a 98 percent probability that an A.I.-generated audio track was used in the video.

Screenshot of the analysis from Hiya’s audio detection tool

Hive AI’s deepfake video detection tool indicated that the video was manipulated using A.I. It flagged several markers of manipulation across the video’s entire runtime, but only on Chaudhary’s face. Their audio tool detected the use of A.I. throughout the audio track except for the last 10 seconds.

Screenshot of the analysis from Hive AI’s deepfake video detection tool

The deepfake detector of our partner TrueMedia suggested substantial evidence of manipulation in the video. The “A.I.-generated insights” offered by the tool provide additional contextual analysis, noting that the video transcript reads like a promotional pitch, full of exaggerated claims conveyed in overtly promotional language.

In a breakdown of the overall analysis, the tool gave a 66 percent confidence score to the “video facial analysis” subcategory, which analyses video frames for unusual patterns and discrepancies in facial features.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the A.I.-generated insights and video analysis from TrueMedia’s deepfake detection tool

The tool gave a 100 percent confidence score to the “A.I.-generated audio detector” subcategory, which detects A.I.-generated audio. It also gave a 98 percent confidence score to the “audio authenticity detector” subcategory, which analyses audio for evidence that it was created by an A.I. generator or through cloning. The “voice anti-spoofing analysis” subcategory, which checks audio for signs of an A.I. audio generator, received a 92 percent confidence score.

Screenshot of the audio and semantic analysis from TrueMedia’s deepfake detection tool

We reached out to ElevenLabs, a company specialising in voice A.I. research and deployment, for an expert analysis of the audio. They told us that they were able to confirm that the audio is synthetic, implying that it was A.I.-generated.

To get another expert to weigh in on the manipulations in the video, we reached out to our partners at RIT’s DeFake Project. Akib Shahriyar from the project noted that there are multiple points in the video where visual sequences are reversed and then repeated to extend a subject’s segment.

The reversed playback of Om Kashyap’s hand gestures and of her fixing her hair, combined with poor lip-syncing, makes it clear that it is a fake, Mr. Shahriyar added, referring to the video.

He also pointed to several instances where the lip movements of both subjects are not aligned with their audio tracks, causing what he described as noticeable “double-chin effects”.

He further noted that the underlying lip-syncing method used in the video generates pixels only for the boxed region around the mouth, that is, the area marked out by a bounding box for digital manipulation.
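To make that boxed region concrete, here is a minimal sketch that locates the mouth bounding box in a single frame using dlib’s pre-trained 68-point facial landmark model, in which points 48 to 67 cover the mouth. The model file and image name are assumptions, and this is not the DeFake Project’s tooling.

    # Draw the mouth bounding box that lip-syncing models typically repaint.
    # Requires dlib's landmark model file, downloaded separately.
    import cv2
    import dlib

    detector = dlib.get_frontal_face_detector()
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    def mouth_box(frame, margin=10):
        """Return (x0, y0, x1, y1) around mouth landmarks 48-67."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray)
        if not faces:
            return None
        shape = predictor(gray, faces[0])
        xs = [shape.part(i).x for i in range(48, 68)]
        ys = [shape.part(i).y for i in range(48, 68)]
        return (min(xs) - margin, min(ys) - margin,
                max(xs) + margin, max(ys) + margin)

    frame = cv2.imread("still.jpg")  # hypothetical exported frame
    box = mouth_box(frame)
    if box:
        x0, y0, x1, y1 = box
        cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 0, 255), 2)
        cv2.imwrite("mouth_box.jpg", frame)

Inspecting that rectangle frame by frame is where the blending seams and resolution mismatches left by lip-syncing models tend to show up.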

To get another expert analysis on the video, we escalated it to the Global Deepfake Detection System (GODDS), a detection system set up by Northwestern University’s Security & AI Lab (NSAIL). They used a combination of 22 deepfake detection algorithms and analyses from two human analysts trained to detect deepfakes to review the video.

Of the 22 predictive models used to analyse the video, five models gave a higher probability of the video being fake, while the remaining 17 models indicated a lower probability of the video being fake.
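Per-model splits like this are typically combined into a single verdict. The sketch below is a purely illustrative aggregation, assuming each detector outputs a probability that the video is fake; GODDS’s actual aggregation method is not described in the source.

    # Tally how many detectors lean "fake" and average their scores.
    def aggregate(probabilities, threshold=0.5):
        votes_fake = sum(p > threshold for p in probabilities)
        return {
            "models": len(probabilities),
            "votes_fake": votes_fake,
            "mean_probability": sum(probabilities) / len(probabilities),
        }

    # Hypothetical scores mirroring the report: 5 of 22 models lean fake.
    scores = [0.9, 0.8, 0.7, 0.65, 0.6] + [0.2] * 17
    print(aggregate(scores))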

The team noted in their report that Om Kashyap’s speech is often unaligned with her mouth movements, and that her features are often unnaturally blurry and misaligned. They added that at several points in the video the footage reverses, causing Om Kashyap to appear to move backwards, which suggests that the footage has been edited.

They also added that a graphic appears several times throughout the video, interrupting the footage, and that many of the visual indicators of manipulation in Om Kashyap’s visuals can be seen in Chaudhary’s footage as well.

In the overall verdict, the GODDS team concluded that the video is likely to be fake and generated with artificial intelligence.

On the basis of our findings and analyses from experts, we can conclude that the video of Om Kashyap and Chaudhary promoting a home remedy for poor eyesight is fake. Unrelated visuals were stitched together and used with synthetic audio to peddle health misinformation.

(Written by Debopriya Bhattacharya and Debraj Sarkar, edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.

You can read below the fact-checks related to this piece published by our partners:

Fact Check: Video Of Doctor Talking About Lemon Water Curing Vision Loss Is A Deepfake

Fact Check: Can lemon water cure eyesight problems in 7 minutes?

Deepfake Videos of Doctor Promoting Home Remedies For Vision Loss Go Viral

Fact Check: This video of Anjana Om Kashyap describing a home remedy that treats eyes in minutes is a deepfake

Deceptive AI Videos Show Doctors Endorsing Miracle Cures for Critical Illnesses like Glaucoma, Hypertension, and Vision Loss