Video Promoting Investment Scheme Tied To Musk’s Starlink Is Fake

June 24, 2024
Screengrabs of the video sent to the DAU tipline

The Deepfakes Analysis Unit (DAU) analysed a video featuring a couple of international billionaires purportedly promoting a lucrative investment scheme. After examining the video using A.I. detection tools and escalating it to our expert partners for assessment, we concluded that the video was created by stitching together unrelated visuals with an A.I.-generated audio track.

The 66-second video in English was sent to the DAU tipline for assessment. It opens with a close-up of a man identified through the voice-over as Israel Englander, a hedge fund firm owner. The same male voice-over can be heard throughout the video over a string of photos and video clips, including those of Elon Musk, who heads the electric car company Tesla and the rocket maker SpaceX.

The audio track peddles a supposed collaboration between Mr. Englander and Mr. Musk on an investment scheme to benefit Israeli citizens, linked to Starlink, a subsidiary of SpaceX focussed on satellite internet technology. A symbol resembling the official logo of Starlink is visible in the upper left corner of the video from the eight-second mark onward.

There is no consonance between the visuals and the narration. The apparent sound bites of Englander, between the 41-second and 44-second mark, and of Musk, between the 55-second and the one-minute-six-second mark, do not carry the original audio but the same voice-over that runs over the rest of the visuals.

To determine the authenticity of the claims put forth through the narration, which is monotonous with no change in pitch or tone, we undertook a reverse image search using screenshots of Englander’s and Musk’s clips from the video.

Englander’s visuals led us to this video, published on Dec. 16, 2009 on the YouTube channel of Opalesque TV, which features interviews with hedge fund experts. The snippet between the 48-second and 52-second mark matches a clip of Englander seen in the video under investigation. Musk’s visuals led us to this video, featuring an interview with him, published on Nov. 1, 2023 on the YouTube channel of The Telegraph; a clip from that interview appears in the manipulated video, though with a cropped backdrop.

Neither of the original interviews mentions a collaboration or an investment scheme linked to Israel. None of the other visuals in the manipulated video could be traced to those interviews, and there is no sign of the Starlink logo in either of the original videos.
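For readers curious about the mechanics of that step, the sketch below shows one way to pull still frames from a video so that individual screenshots can be run through a reverse image search engine. It assumes the OpenCV library and a hypothetical file name; it is an illustration of the general technique, not the DAU’s actual workflow.

```python
# Illustrative sketch only: extract still frames from a video at a fixed
# interval so each can be uploaded to a reverse image search engine.
# The file name, interval, and output naming are hypothetical.
import cv2

def extract_frames(video_path: str, every_n_seconds: float = 2.0, out_prefix: str = "frame"):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back to 25 fps if metadata is missing
    step = max(1, int(round(fps * every_n_seconds)))
    index, saved = 0, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            name = f"{out_prefix}_{index // step:03d}.png"
            cv2.imwrite(name, frame)  # save the still for a manual reverse image search
            saved.append(name)
        index += 1
    cap.release()
    return saved

# Example: extract_frames("suspect_video.mp4") saves one still every two seconds.
```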

To discern if A.I. had been used to manipulate the visuals and the audio track in the video, we put it through A.I. detection tools.

TrueMedia’s deepfake detector categorised the video overall as showing “substantial evidence of manipulation”. The tool gave a 100 percent confidence score to the subcategory of “A.I.-generated audio detection”, an indicator of synthetic speech.

Screenshot of the overall analysis from TrueMedia's deepfake detection tool
Screenshot of the analysis from TrueMedia's deepfake detection tool

Since the visuals in the manipulated video were largely photos pieced together with short clips, stripped of their original audio, the A.I. video detection tools were unable to flag the manipulations in the visual elements.

Hive AI’s deepfake video detection tool did not detect A.I. manipulation in the visuals. However, its audio tool picked up indicators of A.I.-generated audio in some portions.

We also ran the video through the audio detection tool of our California-based partner DeepTrust to discern moments in the video that had a high percentage of synthetic audio.

Screenshot of the analysis from DeepTrust's audio detection tool

The heat-map above indicates that there is a 53.3 percent probability of A.I.-generated audio in the video. The tool computes this average score based on the pattern of generated speech that it identifies in the video. The green strips in the heat-map stand for real audio and red for A.I.-generated audio. The red bits could also denote background noise or music in those portions of the audio.
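DeepTrust’s exact scoring method is not public. Purely as an illustration, the sketch below shows how an overall figure such as 53.3 percent could be computed as a duration-weighted average of per-segment probabilities of synthetic speech, with each segment mapped to the green or red bands described above. The segment values are invented; this is not DeepTrust’s actual algorithm or data.

```python
# Hypothetical illustration: derive an overall synthetic-audio score from
# per-segment probabilities. All numbers below are made up.
segments = [
    {"start": 0.0, "end": 10.0, "p_synthetic": 0.90},
    {"start": 10.0, "end": 30.0, "p_synthetic": 0.20},
    {"start": 30.0, "end": 66.0, "p_synthetic": 0.62},
]

def overall_score(segments):
    # Weight each segment's probability by its duration, then average.
    total = sum(s["end"] - s["start"] for s in segments)
    weighted = sum((s["end"] - s["start"]) * s["p_synthetic"] for s in segments)
    return weighted / total

def label(segment, threshold=0.5):
    # Map each segment to the heat-map colours described in the report.
    return "red (likely A.I.-generated)" if segment["p_synthetic"] >= threshold else "green (likely real)"

print(round(overall_score(segments) * 100, 1))  # 53.5 for these made-up values
for s in segments:
    print(s["start"], s["end"], label(s))
```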

We also reached out to ElevenLabs for a further analysis of the audio. They told the DAU that they were able to confirm that the audio is A.I.-generated. They added that the user who violated their “terms of use” by generating this synthetic audio had already been identified as a bad actor through their in-house automated moderation system and blocked from using their tools.

On the basis of our findings and expert review, we have assessed that an A.I.-generated audio track was combined with disparate visuals to produce a fake video.

(Written by Debraj Sarkar with inputs from Debopriya Bhattacharya, and edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.