From Arnold Schwarzenegger, to Kim Kardashian, to maybe, just maybe: You.
The rise of deepfakes — the eerily lifelike impersonations that seamlessly superimpose heads onto bodies (and bodies onto heads), integrating speech and movement so that nothing is as it seems — may be good for a laugh.
Right now, they’re fodder for viral videos that spur a bit of shock and a bemused “how did they do that?”
But the stats show the rise of a cottage industry (now focused on political figures, celebrities and, perhaps predictably, pornography) that could become a cybersecurity challenge, particularly from an identity verification perspective.
Research from cybersecurity firm Deeptrace counted nearly 14,700 deepfake videos online as of November 2019, up from nearly 8,000 in December 2018. Other studies show that deepfakes can be created from a relatively small amount of “input” material, such as videos or pictures, fed into the underlying algorithms.
Leave it to the fraudsters, of course, to leverage new technology in their never-ending quest to trick banks, companies and governments, to gain access to accounts, or to set up accounts to move or steal money.
In an interview with PYMNTS, Reinhard Hochrieser, vice president of Product Management at authentication services provider Jumio, said when it comes to the emergence of deepfakes, “from an identity verification perspective, we are right at the beginning. But the technology is already extremely solid. It’s like a Hollywood special effect.”
In fact, the technology is so good that human observers cannot discern what is real from what has been concocted — and that’s exactly what the bad guys are banking on. Although cases of deepfake video or audio being used to commit theft or fraud have been isolated so far, it might be only a matter of time before a snowball effect occurs.
Hochrieser related an incident in which fraudsters used artificial intelligence (AI) to impersonate the CEO of a German firm’s parent company, spurring an executive at the firm to wire nearly $250,000 to a foreign bank account.
“That’s just one example to show that this is real, and AI is already being used,” said Hochrieser, who added that fraudsters will increasingly use AI and deepfakes to bypass identity verification and authentication systems.
The best defense against the fraudster armed with AI is to use AI to fight the battle.
As Hochrieser explained, it’s becoming crucial for businesses and government agencies to think about the technologies that need to be deployed in the looming battle against deepfakes. The right approach to biometric authentication, which seeks to determine whether a real person is sitting in front of a computer or wielding a smartphone before allowing them to create or access accounts (bank accounts, as one example), lies in liveness detection.
Liveness detection offers an additional layer of defense, preventing fraudsters from using bots, pre-recorded videos, photos or doctored documents to bypass verification protocols.
Hochrieser cautioned that it is important to deploy certified 3D liveness detection methods. Uncertified methods rely on “tells,” such as blinks, nods and other verification prompts, which can be spoofed by deepfakes. By way of contrast, he advocated the use of certified liveness detection methods tested against the global biometric presentation attack detection standard, ISO/IEC 30107.
Speaking of his own firm’s efforts with such certification in place, he said, “we are able to catch all of these different spoofing attempts, such as pre-recorded videos and 3D facemasks.” He also noted that deepfake videos are two-dimensional, which means effective 3D liveness solutions can determine whether it’s a flat video being presented for verification or a real, three-dimensional human face. Liveness detection can also pick up on gradations in skin texture that give away videos projected onto 3D head forms.
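The core idea — that a replayed deepfake video is flat while a genuine face has depth — can be sketched as a toy check. This is purely illustrative: the function name, depth values and threshold below are invented for the example and are not Jumio’s method or any certified implementation.

```python
# Toy depth-based liveness check (illustrative only; not a certified method).
# A flat replay (a deepfake video on a screen, a printed photo) shows almost
# no depth variation across the face; a genuine 3D face shows several
# centimeters of nose-to-cheek relief.

def is_live_3d_face(depth_map, min_depth_range_mm=20.0):
    """Return True if the captured face shows enough real depth variation."""
    values = [d for row in depth_map for d in row]
    depth_range = max(values) - min(values)
    return depth_range >= min_depth_range_mm

# A screen replay: essentially flat (depth values in millimeters).
screen_replay = [[400.0, 400.5], [400.2, 400.1]]
# A real face: tens of millimeters of depth difference across features.
real_face = [[400.0, 430.0], [455.0, 410.0]]

print(is_live_3d_face(screen_replay))  # False
print(is_live_3d_face(real_face))      # True
```

Certified solutions combine many more signals than a single depth range, but this captures why a 2D deepfake — however convincing to the eye — fails a 3D test.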
Call it a multi-faceted line of defense — and, as Hochrieser told PYMNTS, the battle against deepfakes will always be waged with AI and machine learning.
“It’s impossible [for human eyes] to spot these differences, but for an algorithm, it is not,” he said.
Data matters, of course, and large data sets are invaluable (which is where humans come in, ensuring good data makes its way into models).
Hochrieser said, too, that user experience is key when it comes to organizations tackling deepfakes. Using certified 3D liveness solutions in real time simply requires that users take selfies to capture high-resolution images of their 3D faces — an intuitive process for many people.
Early adopters of 3D liveness detection include challenger banks, government agencies and even car rental agencies, he told PYMNTS, while future use cases might involve using the method to “unlock” high-value transactions.
Robust liveness detection is going to be critical sooner rather than later, he told PYMNTS.
“Right now, it is a pretty sophisticated and time-intensive process to create a deepfake video,” Hochrieser said. “You have to have deep computer science knowledge.”
But technology evolves quickly, he said, adding that within just a few years, the tools needed to create deepfakes will be in reach of, well, everybody.
“You would easily be able to create deepfake videos yourself on your personal computer at home,” he told PYMNTS. “And this is a huge threat for the entire industry. That’s why it is so important to apply the right technology right now.”