Videos of Naresh Trehan, Devi Shetty Promoting Cure for Hypertension, Diabetes Are Fake

December 26, 2024
Manipulated Media/Altered Media
Screengrabs of the videos analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed two videos apparently featuring India’s leading cardiac surgeons, Dr. Naresh Trehan and Dr. Devi Prasad Shetty. In one video the former seems to be promoting a remedy for hypertension, and in the other the latter is seen offering a cure for diabetes. After putting the videos through A.I. detection tools and getting our expert partners to weigh in, we were able to conclude that synthetic audio was used to fabricate the videos.

The Delhi High Court recently issued an injunction to protect Dr. Shetty’s personality rights and his registered trademark against unauthorised use. He is the founder and chairperson of Narayana Health, a private hospital network in India. The Medanta group, led by Dr. Trehan, issued a statement last month in response to the misrepresentation of the doctor’s voice to promote a supposed cure for prostatitis. The DAU debunked a video carrying such content by establishing that it was produced using a synthetic voice.

The two videos in Hindi being linked to the doctors were escalated to the DAU by fact-checking partners for analysis. One of them is a two-minute-and-29-second video that apparently features Trehan, and the other is a one-minute-and-53-second video that purportedly features Shetty. Each video seems to promote a miraculous cure using words and expressions that sound more bombastic than conversational.

In the video featuring Trehan, he appears to be talking to the camera in an office setting with a blurred backdrop that seems to have a computer screen and some books. A male voice recorded over his video track claims that a 30-second routine at home can cure hypertension without any medication. Graphics across the top of the video frame suggest that a “saltwater remedy” can stabilise blood pressure in “seven minutes”. Captions at the bottom of the video frame offer a transcript of the accompanying audio.

The video comprises short clips of Trehan interspersed with unrelated visuals such as those of hospital and laboratory settings, a family enjoying a meal, and outdoor scenes, as well as graphics of molecules and cross-sections of blood vessels. Most of the supposed patients appear to be Westerners. The audio track also mentions that more than “50,000 Greeks” have benefited so far from the purported remedy. The video ends abruptly with the same voice urging people to click on some button to access a separate video that supposedly offers further details; neither the button nor that video is visible anywhere.

The video featuring Shetty shows him speaking to the camera seated on a high-back chair. He is wearing scrubs, indicating that the video was perhaps recorded in his office. The male voice recorded over his video track claims that a home remedy can treat diabetes and related health conditions permanently within “33 hours”. However, graphics across the top of the video frame suggest “37 hours” instead. In this video too, captions at the bottom of the video frame offer a transcript of the accompanying audio.

This video also carries unrelated visuals, including those of a concoction being prepared, elderly Westerners either receiving some treatment or enjoying the outdoors, and an insulin bottle and a syringe; all of these have been stitched together with clips of Shetty.

The voice in the video goes on to claim that Metformin, a medicine commonly used to treat diabetes and other health conditions, is harmful and that the supposed new remedy is superior to conventional treatments. It also claims that on Oct. 31, 2024, Shetty made a breakthrough in achieving perfect control over diabetes, which apparently convinced the Indian Health Ministry to endorse the treatment and offer it through some government-backed scheme.

The video ends with the voice suggesting that the government has created some quota for diabetes patients, which allows them to access another video that warns about Metformin use and offers a permanent cure for diabetes. However, neither that video nor a link to it is visible anywhere, just as in the video supposedly featuring Trehan.

The fonts, colour scheme, and placement of the graphics are almost identical in both videos.

In the video supposedly featuring Trehan, his lip movements are mostly out of sync with the accompanying audio track. His lips appear to move even when there is a pause in the audio. The mouth area does not seem to move in tandem with the rest of his facial muscles. His teeth are blurred; in some frames a brown patch is visible between the lips, and in other frames one can vaguely spot the outline of a few teeth.

The voice in the audio track sounds somewhat like Trehan’s when compared with his recorded interviews. However, it has a hurried, nasal quality and lacks intonation. A distinct noise, resembling the sound of human breathing, can be heard in the pauses between sentences.

In the supposed Shetty video, the segments featuring him are marked by quivering around his mouth region and oddly shaking teeth. In some frames, an additional set of teeth seems to appear as his lips move. The audio track and lip movements look mostly synchronised.

The voice in the audio track sounds similar to that heard in his recorded interviews, but the pitch is different, and the delivery lacks his characteristic pauses and sounds scripted. The voice also seems to have difficulty pronouncing numbers in Hindi.

We undertook a reverse image search using screenshots from the videos under review to trace the origin of the clips interspersed in them.
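As an illustration of this step (and not the DAU’s actual tooling), the short Python sketch below shows one way to export still frames from a video so they can be run through a reverse image search engine. It assumes OpenCV is installed, and the file names are placeholders.

import cv2  # pip install opencv-python

# Placeholder file name; any locally saved copy of the video would do.
cap = cv2.VideoCapture("suspect_video.mp4")
fps = cap.get(cv2.CAP_PROP_FPS) or 25  # fall back to 25 fps if metadata is missing

frame_index = 0
saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Keep roughly one frame per second of footage.
    if frame_index % int(fps) == 0:
        cv2.imwrite(f"frame_{saved:04d}.jpg", frame)
        saved += 1
    frame_index += 1

cap.release()
print(f"Saved {saved} stills for reverse image search")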

Trehan’s clips could be traced to this video, published on March 17, 2020, on the official YouTube channel of Hindustan Times, an Indian English-language newspaper. Shetty’s clips could be traced to this video, published on Sept. 29, 2022, on the official YouTube channel of Narayana Health.

The clothing, backdrop, and body language of Trehan and Shetty are identical in the two videos we reviewed and in the videos we were able to locate; the audio tracks in the two sets, however, are entirely different. The doctors speak in English in their respective original videos, and neither original mentions any home remedy to cure hypertension or diabetes, nor carries any graphics suggesting so.

The doctored videos seem to have used zoomed-in frames from the original videos, cropping out the backdrop, logos, and names. Trehan’s original video carries the logo of a media house in the top right and a subscription button in the bottom right, while Shetty’s original video carries the logo of Narayana Health in the top right. The other clips seen in the manipulated videos are not part of the originals.

To discern the extent of A.I.-manipulation in the videos under review, we put them through A.I.-detection tools.
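Some of these tools accept only a standalone audio file rather than a full video, in which case the audio track has to be separated first. The sketch below, assuming Python and a local ffmpeg installation and using placeholder file names, is illustrative and not the DAU’s actual pipeline.

import subprocess

# Separate the audio track from the video so it can be uploaded to a
# voice-detection tool; the file names here are placeholders.
subprocess.run(
    [
        "ffmpeg",
        "-i", "suspect_video.mp4",   # input video
        "-vn",                       # drop the video stream
        "-acodec", "libmp3lame",     # re-encode the audio as MP3
        "audio_track.mp3",
    ],
    check=True,
)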

The voice tool of Hiya, a company that specialises in artificial intelligence solutions for voice safety, indicated that there is a 79 percent probability that the audio track in the video apparently featuring Trehan was A.I.-generated.

Screenshot of the analysis from Hiya’s audio detection tool for supposed Trehan video

For the supposed Shetty video, the tool indicated that there is a 98 percent probability that the audio track in the video was A.I.-generated.

Screenshot of the analysis from Hiya’s audio detection tool for supposed Shetty video

Hive AI’s deepfake video detection tool pointed out markers on Trehan’s face in various frames throughout the video, indicating that the video was manipulated using A.I. However, their audio tool found very little evidence of A.I. manipulation in the audio attributed to Trehan.

Screenshot of the analysis from Hive AI’s deepfake video detection tool for supposed Trehan video

For the supposed Shetty video, Hive’s video tool did not indicate any A.I. manipulation in the video track. However, their audio tool indicated signs of A.I. manipulation in the accompanying audio track.

Screenshot of the analysis from Hive AI’s deepfake video detection tool for supposed Shetty video

The deepfake detector of our partner TrueMedia suggested substantial evidence of manipulation in the video apparently featuring Trehan. The “A.I.-generated insights” from the tool suggest that the transcript appears to be a scripted promotional piece, and that the mention of a “free video” further proves that this is not a genuine conversation.

Screenshot of the overall analysis and A.I.-generated insights from TrueMedia’s deepfake detection tool for supposed Trehan video

The tool gave an 89 percent confidence score to the “face manipulation detector” subcategory and a 61 percent confidence score to the “video facial analysis” subcategory. The former detects potential A.I. manipulation of faces, as in the case of face swaps and face reenactment, while the latter checks video frames for unusual patterns and discrepancies in facial features.

The tool gave a confidence score of 96 percent to the “A.I.-generated audio detector” subcategory, which detects A.I.-generated audio. It gave an 83 percent confidence score to the “audio authenticity detector” subcategory and a 78 percent confidence score to the “voice anti-spoofing analysis” subcategory; these checks analyse whether the audio was generated by an A.I. audio generator or through voice cloning.

Screenshot of the video and audio analysis from TrueMedia’s deepfake detection tool for supposed Trehan video

The tool detected substantial evidence of manipulation for the supposed Shetty video as well. The “A.I.-generated insights” offered by the tool suggest that the transcript reads like an online scam, making exaggerated claims with promotional language, a common tactic in misleading advertisements.

Screenshot of the overall analysis and A.I.-generated insights from TrueMedia’s deepfake detection tool for supposed Shetty video

The tool gave an 84 percent confidence score to the “deepfake face detector” subcategory, which uses a visual detection model to detect faces and check whether they are deepfakes. The subcategories of “face manipulation detector” and “video facial analysis” received confidence scores of 63 percent and 42 percent, respectively.

The tool gave a confidence score of 100 percent to the subcategories of “A.I.-generated audio detector” as well as “voice anti-spoofing analysis”; and a 98 percent confidence score to the “audio authenticity detector” subcategory.

Screenshot of the video and audio analysis from TrueMedia’s deepfake detection tool for supposed Shetty video

For further analysis of the audio tracks in the videos, we put them through the A.I. speech classifier of ElevenLabs, a company specialising in voice A.I. research and deployment. The tool returned results indicating that there is a 60 percent probability that the audio track attributed to Trehan was generated using their platform, and a 96 percent probability that the supposed Shetty audio was generated using their platform.

For expert analysis, we shared the two videos with our detection partner ConTrailsAI, a Bangalore-based startup with its own A.I. tools for detection of audio and video spoofs.

The team ran both videos through audio as well as video detection models; the results indicated high confidence of manipulation in both videos.

For the supposed Trehan video, they noted that the lips and the area around them looked more pixellated than the rest of the face, and that the audio tonality sounded unnatural.

Screenshot of ConTrails AI's visual analysis for Trehan’s supposed video
Screenshot of ConTrails AI's audio analysis for Trehan’s supposed video

For the supposed Shetty video, they added that the teeth and lip movements look very animated with respect to the rest of the face, indicative of a lip-sync attack. The team also deemed the tonality of the voice unnatural.

Screenshot of ConTrails AI's visual analysis for Shetty’s supposed video
Screenshot of ConTrails AI's audio analysis for Shetty’s supposed video

To get another expert to weigh in on the videos, we reached out to our partners at RIT’s DeFake Project. Kelly Wu from the project observed that the video apparently featuring Trehan is of low quality and that the main subject shows minimal mouth movement.

Pointing to a particular moment in the video, Ms. Wu noted that while the audio can be clearly heard and sounds like a complicated word, there is no noticeable movement of the mouth. This, she explained, could be due to the video and audio tracks being out of sync.

She also observed that in the supposed Shetty video, images flash for very brief moments at different timestamps. She said this is a trend that their team has noticed in other videos they have analysed for the DAU as well.

She also pointed out that Shetty’s mouth region has a flickering effect as he talks, possibly because of his teeth, which appear to change as he speaks. She noted that the teeth sometimes look very sharp and distinct, and at other times appear as a blurry white blob. Her observations indicate signs of manipulation.

On the basis of our findings and analysis from experts, we can conclude that original footage was used with synthetic audio to fabricate both videos. It is yet another attempt to tout dubious remedies by linking them to prominent doctors, such as Trehan and Shetty, in a bid to peddle health misinformation.

(Written by Debraj Sarkar and Debopriya Bhattacharya, edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.

You can read below the fact-checks related to this piece published by our partners:

Fact Check: Did Dr Naresh Trehan promote a quick fix for high blood pressure?

Fact Check: AI-generated video of Dr. Naresh Trehan goes viral; he did not advertise a medicine for BP and heart attack

This video of Dr. Naresh Trehan explaining a cure for high blood pressure is a deepfake

Fact Check: Is Dr Devi Shetty promoting a diabetes cure in just 37 hours?

Fact Check: Video of Dr Devi Shetty Offering Diabetes Remedy Is AI-manipulated