As Face-to-Face Goes Digital, Identity Defenses Must Come In Layers

In the emerging next phase of eCommerce, where interactions increasingly take place across platforms, between apps and devices and through “digital front doors,” we’ll need “layers” of authentication of people, places and things to keep us safe, Ilan Maytal, chief data officer of AU10TIX, told PYMNTS.

The conversation took place against the backdrop of sharing economy research conducted jointly by PYMNTS and AU10TIX, which showed that we’re moving from one-time identity verification efforts to systems that continuously verify individuals.

At a time when platforms and apps have taken the place of face-to-face interactions, “when you do things over the internet, people need to trust you,” said Maytal. And, as he noted, continuous ID efforts can help sharing economy firms safeguard their customers and their experiences.

Before the sharing economy took shape, Maytal said, most companies had a simple motivation for tackling identity verification: to onboard as many customers as possible or, perhaps, to comply with local or international regulations.

In many cases, these firms would do the bare minimum of verification to avoid interrupting the customer journey, fully aware that such interruptions and friction lead to churn.

The Influence Of The Sharing Economy On Identity Verification

But the sharing economy (with 86 million U.S. consumers) has introduced new dynamics into the mix, said Maytal. Corporate reputations can be made quickly and torn down just as fast. At a high level, for example, vacation rental firms must demonstrate that their properties are as advertised, and ride-hailing companies must know that their drivers and riders are who they say they are. The economics of this two-sided sector are far more sensitive to fraud and abuse, or even rumors of them, because trust is paramount when experiences are being shared.

“This is why we need ongoing verification of customers and service providers,” said Maytal.

A layered approach is necessary, one that embraces biometrics and other advanced technologies. Past efforts relied solely on examining and authenticating paper documents; firms like AU10TIX, he said, focus on augmenting that core function with newer capabilities such as face comparison, liveness detection and other biometric checks.
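
To make that layering concrete, here is a minimal, hypothetical sketch (the function names, scores and thresholds are illustrative assumptions, not AU10TIX’s actual product or API) of how a verification flow might chain document checks, face comparison and liveness detection, approving a session only when every layer clears its threshold:

```python
# Minimal, hypothetical sketch of a layered identity-verification flow.
# Function names, scores and thresholds are illustrative assumptions,
# not AU10TIX's actual product or API.

from dataclasses import dataclass


@dataclass
class LayerResult:
    layer: str
    score: float      # 0.0 (certain fail) to 1.0 (certain pass)
    threshold: float

    @property
    def passed(self) -> bool:
        return self.score >= self.threshold


def check_document(document_image: bytes) -> LayerResult:
    """Layer 1: authenticate the ID document itself (fonts, holograms, MRZ)."""
    score = 0.97  # placeholder for a real document-forensics model
    return LayerResult("document", score, threshold=0.90)


def compare_face(document_image: bytes, selfie_image: bytes) -> LayerResult:
    """Layer 2: compare the portrait on the document to a live selfie."""
    score = 0.93  # placeholder for a real face-matching model
    return LayerResult("face_match", score, threshold=0.85)


def check_liveness(selfie_video: bytes) -> LayerResult:
    """Layer 3: confirm a live person is present, not a replay or deepfake."""
    score = 0.88  # placeholder for a real liveness / anti-spoofing model
    return LayerResult("liveness", score, threshold=0.80)


def verify_identity(document_image: bytes, selfie_image: bytes, selfie_video: bytes) -> bool:
    """Run every layer; a single failing layer blocks the session."""
    results = [
        check_document(document_image),
        compare_face(document_image, selfie_image),
        check_liveness(selfie_video),
    ]
    for result in results:
        print(f"{result.layer}: {result.score:.2f} (passed={result.passed})")
    return all(r.passed for r in results)


if __name__ == "__main__":
    approved = verify_identity(b"doc-image", b"selfie-image", b"selfie-video")
    print("approved" if approved else "blocked")
```

In practice, each placeholder score would come from a dedicated model or vendor service; the point of stacking layers is that a stolen document alone is no longer enough to pass.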

But, as he noted, the jury is still out on which of these emerging technologies will be allowed to be deployed and which will not.

Maytal told PYMNTS that any number of governments worldwide, the U.S. included, are wrestling with how to define, and in some cases limit, the types of data that can be collected. There is always the specter of government regulation or compliance risk tied to privacy.

“That means that not everything we can provide [in terms of technology] can be used to execute or complete ID verification,” he told PYMNTS. He pointed to GDPR, which limits what can be gathered in Europe. In the U.S., there is a groundswell of support for making sure that consumers consent to the data that can be collected and used in verification efforts. Maytal predicted that there would be at least some support from the consumers themselves to offer up their likenesses, voices, and other telltale identity signals to providers.

Where Identity Education And Innovation Will Make An Impact

No matter which solutions gain widespread adoption, Maytal cautioned that education would be key to assuring stakeholders that privacy rules are strictly followed during verification.

“Many more customers do understand that for their safety, providers need to take more data from them” in order to, for example, compare their faces to an ID, he noted. “They’re ready to share their data if they understand the benefits.” That’s especially true as automation can scan and authenticate the details on offer with speed and accuracy, leaving the consumer journey barely impacted.

For businesses, the urgency for additional layers of defense is clear, especially as deepfakes grow as an avenue of attack.

As Maytal described them, deepfakes are difficult to detect because bad actors can take a legitimate ID off the dark web (for a small fee) and use it in tandem with lifelike videos and composite data to bypass even the most cautious security providers. “We need to deploy more layers to overcome these deepfake issues” (and those to come, as recently explored in The Wall Street Journal), he said, adding, “I don’t see the threat going away.”

The next generation of authentication tools will be biometric, he predicted, adding, “[There will be many more tools to deploy.]” As our daily routines become more automated and interconnected, the importance of confidently linking physical and digital identities and assets will only increase, requiring further innovation.