Video of Israeli Dr. Ronni Gamzu Promoting a Cure for Parasites Is Fake

September 19, 2024
Manipulated Media/Altered Media
Screengrabs of the video analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video featuring Ronni Gamzu, a prominent doctor and hospital administrator in Israel, purportedly promoting a cure for parasites affecting humans. After we put the video through A.I. detection tools and got our expert partners to weigh in, we were able to conclude that an original video featuring Dr. Gamzu was fabricated with fake audio. 

The one-minute-and-four-second video in English was sent to the DAU tipline for assessment. The video opens with visuals of Gamzu speaking to the camera in a study-like setting with stacks of books in the backdrop. The male voice that can be heard along with the visuals accuses pharmaceutical companies of selling ineffective pills to cure parasitic diseases and promises a product — without naming it — that could serve as an effective cure.

The audio track mentions that the link for ordering the supposed product is available at the end of the video. However, no such link appears; instead, the video ends abruptly with an idyllic image. Captions in Hebrew appear throughout the video.

The lip-sync in the video is imperfect. Gamzu’s lips seem to quiver and his chin appears to distort when his mouth opens and closes while speaking. His teeth look blurry and his dentition seems inconsistent. For a four-second segment, Gamzu’s face appears to be static with only his lips moving and his eyes blinking oddly in a puppet-like manner.

Dr. Ronni Gamzu’s chin seems to distort as he speaks

The voice and the accent heard in the video sound similar to Gamzu’s, based on a comparison with his publicly available interviews and speeches. However, it lacks the characteristic pauses and changes in tone observed in his usual style of delivery.

After running a reverse image search using screenshots of Gamzu from the video, we were able to locate this video published on Sept. 2, 2021, from the official YouTube channel of i24NEWS English, an Israel-based television network broadcasting in several languages.

Gamzu’s clothing, backdrop, and body language in the original video and the one we analysed are identical. He speaks in English in both videos, but the audio is completely different. There is no mention of any cure for parasites in the original video, in which he is being interviewed by a male news anchor; the visuals focus on the two men, and the supers are in English rather than Hebrew, as in the doctored video. The video frames in the doctored video appear cropped, and the media house logo visible in the top left corner of the original video is missing.

On comparing the hand gestures of Gamzu in the two videos, it appears that separate clips from the original video were stitched together to produce about a minute-long video track featuring only Gamzu, and a fake audio track was added to it.

To discern if A.I. was used to manipulate the video we ran it through A.I. detection tools.

The voice detection tool of Loccus.ai, a company that specialises in artificial intelligence solutions for voice safety, returned results which indicated that there was a very high probability that the audio track in the video was A.I.-generated.

Screenshot of the analysis from Loccus.ai’s audio detection tool

Hive AI’s deepfake video detection tool highlighted a small portion in the video where the tool seems to have picked up signs of manipulation using A.I. Their audio tool also gave indicators of A.I. use in the audio track.

Screenshot of the analysis from Hive AI’s deepfake video detection tool

The deepfake detector of our partner TrueMedia suggested substantial evidence of manipulation in the video. In a breakdown of the overall analysis, their tool gave 100 percent confidence scores to “voice anti-spoofing analysis” and “A.I.-generated audio detector”; both subcategories indicate the presence of A.I. in an audio track. It also gave a 99 percent confidence score to the “audio authenticity detector” subcategory, which analyses the track for evidence of it being a voice clone or a product of an A.I. generator.

The tool gave a 96 percent confidence score to the “face manipulation detector” subcategory, which detects the likelihood of A.I. manipulation of faces, as in the case of face swaps and face reenactment; in this case it analysed Gamzu’s face. It gave only a 32 percent confidence score to “video facial analysis”, a subcategory analysing video frames for unusual patterns and discrepancies in facial features.

A new feature in the tool titled “AI-generated insights” offers additional contextual analysis on the video. It suggests that the video resembles a typical advertisement for a health product, and uses fear-based tactics to persuade the audience without offering any genuine medical or scientific discussion.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and video analysis from TrueMedia’s deepfake detection tool

We also ran the video through Deepfake-o-meter, an open platform for deepfake image, video, and audio detection developed by the Media Forensics Lab (MDFL) at the University at Buffalo. The tool gives an option of various classifiers through which a media file, in this case a video, can be run to receive analysis.

We chose the LIPINC (2024) classifier, which focuses on lip-synced deepfake detection based on mouth inconsistencies. The results from the classifier indicated a very high likelihood of the video being fake, with 100 percent probability that it was digitally manipulated and had signs of a lip-sync deepfake, a type of manipulation in which a person’s lip movements are generated using A.I. models to match altered audio.

Screenshot of the analysis from Deepfake-o-meter’s LIPINC classifier

For further analysis of the audio, we also put it through the A.I. speech classifier of ElevenLabs, a company specialising in voice A.I. research and deployment. It returned a “very unlikely” result, indicating that it was highly unlikely that the audio track featured in the video was generated using their software.

We reached out to ElevenLabs for a comment on the analysis. They told us that they were not able to confirm that the audio was A.I.-generated. They added that they have been actively identifying and blocking attempts to generate prohibited content; however, it has not been confirmed that this particular audio was generated on their platform.

To get another expert to weigh in, particularly on the audio, we escalated it to our partner IdentifAI, a San Francisco-based deepfake security startup. They told us that the video was a poor example of a lip-sync deepfake. The team added that there is a significant delay where the audio is not matched up directly with the lips.

The audio verification by the team suggested that the audio was more likely a case of someone else speaking or a very poorly constructed audio deepfake. They added that they were able to draw that conclusion based on the quality and structure of the audio, which is not indicative of a deepfake or voice cloning model.

For another expert view on the video, we reached out to our partners at RIT’s DeFake project. Saniat Sohrawardi and Kelly Wu from the project stated that the video shows clear presence of artefacts stemming from the usage of a lip-sync deepfake video generation model, and pointed out various instances where the artefacts appear.

The team noted that Gamzu’s head went unnaturally static during the tail end of the video and the mouth movements became more pronounced, while his tone did not necessarily change.

Gamzu's head becomes unnaturally static with pronounced mouth movements

They added that Gamzu’s teeth unnaturally change across a few frames and that the artefact on the chin hints toward the existence of a bounding box where the mouth was regenerated. A bounding box is a rectangular frame which is used in computer vision and image processing to mark where an object is located in an image or video.

Artefact on Gamzu's chin hints at a regenerated mouth area

The team added that the artefact on the chin is more pronounced in some frames, where the chin can be seen extending unnaturally. They noted that this could be a result of the algorithm failing to extract proper landmarks — structural details of the face — which led to the generation of a frame where the generated mouth does not match the previous frames.

They also remarked that beyond the visible artefacts, there are plenty of instances in the video where the audio and the mouth do not properly sync.

On the basis of our findings and analysis from experts, we can conclude that the video of Gamzu promoting a cure for parasites is fake. His original video was fabricated with an audio track that is highly likely synthetic or a voice dub.

(Written by Debraj Sarkar and Debopriya Bhattacharya, edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.