As technology evolves, fact-checkers and journalists face a growing challenge: purveyors of disinformation now have tools to create fake videos, images, and audio that depict individuals saying or doing things they never said or did.
Does a 50-second video show authentic remarks by U.S. State Department spokesperson Matthew Miller discussing "military targets" in the Russian city of Belgorod, with "virtually no civilians left" in that city?
No, that's not true: The video splices together footage from different briefings, during which Miller made no such remarks. The words falsely attributed to him were AI-generated. The State Department labeled the video a deepfake.
The video of fake remarks was also posted on X by the account of the Russian Embassy in South Africa, but the post was later deleted.
The fake Tom Cruise video, which appeared on the Telegram messaging platform last year, is called Olympics Has Fallen and uses artificial intelligence-generated audio of the film star's voice to deliver a 'strange, meandering script' disparaging the IOC. The fake documentary, whose title riffs on the Gerard Butler action film Olympus Has Fallen, also falsely claims to have been produced by Netflix and is promoted with bogus five-star reviews attributed to the New York Times and the BBC.
An excerpt from a broadcast on X (formerly Twitter), in which Elon Musk appears to praise Putin and Russia, was generated by a neural network and initially shared by Russian sources. The original recording of the broadcast, in which Musk says no such thing, is available online.
The video report spreading online is fake, and the story about the Ukrainian scammers is itself made up. The Times of Israel did not publish such information on its website or its social networks.
The video, which Russian media presented as an official promotional campaign by the Ukrainian Armed Forces, is fake. The compilation of footage from random sources was broadcast only by propaganda outlets.
Since Russia's invasion of Ukraine in February 2022, Ukrainian President Volodymyr Zelenskyy has been the subject of numerous false rumors, particularly in the form of doctored or misleading photographs and video footage.
From a manipulated deepfake video of Zelenskyy supposedly telling Ukrainian soldiers to surrender to Russia to false claims he displayed Nazi logos on his clothes, there is no shortage of examples.
In the video, President Joe Biden appears to say he's reinstating the draft so the U.S. can help defend Ukraine against Russian forces. The video is a deepfake.
An Al Jazeera spokesperson told VERIFY in an email: "The video in question is completely fake and Al Jazeera never published this or any other material related to it."
Two viral videos purported to show that Ukraine President Volodymyr Zelenskyy uses cocaine. One video appeared to show cocaine on Zelenskyy's desk as he spoke, while the other appeared to show him saying that he uses the drug routinely.
The first video was doctored; in the original, there is no white substance on Zelenskyy's desk. The second video was deceptively edited; in the original, he denied using drugs.
Our ruling
Two videos claimed to show Zelenskyy either using or admitting to using cocaine.
The first video purports to show a white powder on Zelenskyy's desk as he places a video call to Musk. But the video was doctored to include the substance, which does not appear in the original.
The second video appears to show Zelenskyy talking about using cocaine regularly, but the video was deceptively edited to reorder Zelenskyy's words. In reality, Zelenskyy denied using drugs.
We rate these videos Pants on Fire!