Researchers have discovered a surprisingly simple way to detect deepfake video calls: ask the suspect to turn sideways.
The trick was shared this week by Metaphysic.ai, a London-based startup behind the viral Tom Cruise deepfakes.
The company used DeepFaceLive, a popular app for video deepfakes, to transform a volunteer into various celebrities.
Most of the recreations were impressive when the subjects looked straight ahead. But once the faces rotated a full 90 degrees, the images became distorted and the spell was broken.
The team believes the defects emerge because the software uses fewer reference points to estimate lateral views of faces, leaving the algorithm unable to guess what the face should look like from the side.
“Typical 2D alignment packages consider a profile view to be 50% hidden, which hinders recognition, as well as accurate training and subsequent face synthesis,” Metaphysic.ai’s Martin Anderson explained in a blog post.
“Frequently the generated profile landmarks will ‘leap out’ to any possible group of pixels that may represent a ‘missing eye’ or other facial detail that’s obscured in a profile view.”
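The “50% hidden” intuition can be illustrated with a toy model (a rough sketch, not DeepFaceLive’s actual pipeline): place 68 hypothetical landmark points on the front arc of a head “cylinder” and count how many still face the camera as the head rotates. The landmark positions and angles below are invented for illustration.

```python
import math

# Hypothetical model: 68 landmarks spread across the front of the face,
# each at an azimuth angle between -60° and +60° around the head.
LANDMARKS = [math.radians(-60 + 120 * i / 67) for i in range(68)]

def visible_landmarks(yaw_deg):
    """Count landmarks whose surface still faces the camera after
    rotating the head by yaw_deg degrees (camera at azimuth 0)."""
    yaw = math.radians(yaw_deg)
    return sum(1 for a in LANDMARKS if abs(a + yaw) < math.pi / 2)

for yaw in (0, 45, 90):
    print(yaw, visible_landmarks(yaw))
# → 0 68 / 45 59 / 90 34
```

At a full 90-degree profile, exactly half of the toy landmarks rotate out of view, matching the rough proportion Anderson describes: the detector has only half its usual reference points, and the rest must be hallucinated.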
These weak spots can be strengthened, but it takes a lot of work.
YouTuber DesiFakes proved it was possible by inserting a deepfake Jerry Seinfeld into a scene from Pulp Fiction, but this required extensive post-processing. Seinfeld’s profile also closely resembled that of the original actor.
Such fixes are hard to replicate for the general public, because we’re rarely filmed or photographed in profile — unless we get arrested.
This can leave deepfake models with insufficient training data to generate realistic lateral views.
Metaphysic.ai’s research emerges amid growing concerns about deepfake video calls.
In June, several European mayors were duped by video calls with a deepfake of Kyiv mayor Vitali Klitschko.
Days later, the FBI warned that scammers were using deepfakes in interviews for fully remote jobs that provide access to valuable information.
The side-on trick may not have saved every victim, and it won’t work forever: future 3D landmark systems may produce convincing profile views, while photorealistic CGI models could replace entire heads.
Nonetheless, the side-view trick offers a fresh chance to catch the fakers — and another reason not to get arrested.