Why Alexa Won’t Replace Call Center Agents (Just Yet)

Remember how Barbie — at first just a popular doll with pretty hair and clothes — eventually became a teacher, gymnast, veterinarian, astronaut, paratrooper, NASCAR driver, business executive and even President of the United States?

Amazon’s Alexa has been building a similar resume, adding well over 15,000 skills to its voice-activated repertoire — a 50 percent increase just from February to July of 2017.

Last November, Amazon introduced the world to Lex, the artificial intelligence service that powers voice and text conversations for Alexa. In the spring, Lex became available to all developers who wished to include a voice assistant chat feature in their mobile apps.
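For developers, the hook into Lex is its runtime API. As a rough sketch only, assuming a hypothetical bot called "OrderStatusBot" has already been built and published, relaying a customer's typed message from an app might look something like this in Python:

```python
# Minimal sketch: relaying a user's message to an Amazon Lex (V1) bot via the
# lex-runtime API. The bot name, alias and user ID are hypothetical placeholders.
import boto3

lex = boto3.client("lex-runtime", region_name="us-east-1")

def ask_lex(user_id: str, text: str) -> str:
    """Send one utterance to the bot and return its reply."""
    response = lex.post_text(
        botName="OrderStatusBot",   # hypothetical bot
        botAlias="prod",            # hypothetical alias
        userId=user_id,             # ties the exchange to a single conversation
        inputText=text,
    )
    # Lex returns the fulfilled (or clarifying) message for the app to display.
    return response.get("message", "")

print(ask_lex("caller-123", "Where is my order?"))
```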

One of the eCommerce giant’s goals with Lex has been to get her to a point where she can take over some basic call center functions. Doing so would benefit callers by giving them more direct access to answers — with no more frustrating phone menus!

On the business side, Lex could potentially increase efficiency and cost-effectiveness by reducing the number of live agents needed to handle calls (as well as capital expenditures on supplies, such as desks and phone sets) and tapping live staff only for more complicated queries.

Although Lex is now a year old and has been previewed by companies including the American Heart Association and HubSpot, some say she still has a lot of growing up to do before she can truly take the place of call centers, in whole or in part. That's despite the fact that self-service call environments are still relying on some truly archaic authentication methods.

“I don’t think Alexa is road-ready for true voice biometric determination for an MFA [multi-factor authentication], even in-home,” IntraNext CEO Patrick Brown told Karen Webster in a recent interview. “How confident can I be that my Echo Show validated who I am when it comes to banking and purchasing? I’m a believer that Alexa’s always getting smarter, but I’d like to learn more about the API security model.”

Brown and Webster agreed that, as an authentication method, Alexa (and voice in general) still provides far more questions than answers. Voices, noted Webster, are easy to spoof and are not secure enough on their own to be used for identity verification.

Despite the rapid pace of evolution in the industry, Brown said he has yet to see a good solution, and IntraNext's customers, like many, are moving slowly, unsure where to place their trust. The executive said IntraNext isn't fielding as many questions following the Equifax data breach as might be expected. Customers are not ready to make final decisions, and they certainly don't want to react in a knee-jerk fashion.

“They’re still in the shell-shocked or research phase,” Brown said. “There are smart folks behind the scenes looking for the best solutions.”

One strategy that some call centers have already enacted is proactive callbacks. If the dots don’t connect properly, an agent will call the customer back on a separate line that is already on file — say, a home landline as opposed to a cell phone — to complete changes to service and similar orders.
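As an illustration of that policy, rather than any particular vendor's implementation, the routing decision can be boiled down to a few lines. The field names and risk threshold below are assumptions made for the example:

```python
# Illustrative sketch only: route risky requests to a callback on a number
# already on file. Field names and the 0.3 risk threshold are assumptions.
from dataclasses import dataclass

@dataclass
class CustomerRecord:
    landline_on_file: str   # e.g., the home landline already in the CRM
    mobile_on_file: str

def handle_change_request(record: CustomerRecord, incoming_number: str,
                          risk_score: float) -> str:
    """Decide whether to proceed in-call or verify via proactive callback."""
    if risk_score < 0.3 and incoming_number == record.mobile_on_file:
        return "proceed: complete the service change on this call"
    # The dots don't connect: call back on the *other* line already on file.
    callback_line = (record.landline_on_file
                     if incoming_number != record.landline_on_file
                     else record.mobile_on_file)
    return f"hold: agent calls the customer back at {callback_line} to confirm"
```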

Generally speaking, Brown recommends building higher barriers and creating further challenges to step up authentication. Today, he said, callers are subjected to an initial screening, but they are only escalated to the next level of authentication if something seems fishy.

Brown said that’s not enough anymore. The information that must be presented in that initial screening is far too easily knowable to create any real security — and again, as Webster noted, voices are easy to spoof.
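To make the distinction concrete, here is a minimal sketch of that step-up approach, with the factor names and policy invented for illustration: sensitive actions always require a stronger second factor, instead of escalating only when the first screen looks fishy.

```python
# Illustration of "step-up" authentication; factor names and policy are
# assumptions, not any vendor's actual product behavior.
KNOWABLE_FACTORS = {"date_of_birth", "last_four_ssn", "mailing_address"}
STRONG_FACTORS = {"one_time_passcode", "registered_device_callback"}

def authenticate(presented: set, requested_action: str) -> bool:
    """Require a strong factor for sensitive actions, not just on suspicion."""
    passes_initial_screen = bool(presented & KNOWABLE_FACTORS)
    if requested_action == "balance_inquiry":
        return passes_initial_screen
    # Account changes, payments, etc. always demand a stronger factor,
    # because the initial-screen answers are too easy to look up or spoof.
    return passes_initial_screen and bool(presented & STRONG_FACTORS)

# Example: knowledge-based answers alone no longer unlock an address change.
print(authenticate({"date_of_birth", "last_four_ssn"}, "address_change"))  # False
```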

He gave the example of a recent call he made to the Social Security Administration, during which he said he encountered all the same multi-factor authentication elements that have been around for decades.

“If I were active on social media, that would be readily accessible to anybody,” he said. And with 2 billion active users on Facebook alone, that risk is very real for a large share of the global population.

IntraNext’s response? Brown said the company is focused on securing data during live agent call interactions. For the foreseeable future, humans will still be needed for complex queries, he said, and the goal is to make those interactions as secure and as customer-friendly as possible.

Brown also said that strategic partnerships with companies specializing in early fraud detection tools, such as checking IP addresses and validating cell phone networks, are a path IntraNext is actively pursuing.
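What that early-stage screening might look like in practice is sketched below; the risky-network list and the carrier lookup are placeholders standing in for a specialist partner's tools:

```python
# Sketch of an early fraud screen run before a contact ever reaches an agent.
# The lookup functions and example network range are placeholders.
import ipaddress

HIGH_RISK_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]  # example range

def ip_looks_risky(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in HIGH_RISK_NETWORKS)

def carrier_matches_record(phone_number: str, expected_carrier: str) -> bool:
    # Placeholder: a real deployment would query a carrier-validation service.
    looked_up_carrier = "ExampleCarrier"
    return looked_up_carrier == expected_carrier

def early_screen(ip: str, phone_number: str, expected_carrier: str) -> str:
    if ip_looks_risky(ip) or not carrier_matches_record(phone_number, expected_carrier):
        return "flag: route to step-up authentication"
    return "pass: continue to normal handling"
```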

The end result of these strategies and partnerships, he said, will hopefully be to nip fraud in the bud. However, there is no single company that can effectively secure it all; “it will take all of us, working together in our areas of specialty, to put up continued roadblocks for fraudsters,” Brown said.

Self-care, chatbots, artificial intelligence and machine-to-machine learning are here to stay. The first step in effective fraud prevention is to “scrub out the bad guys, right out of the gate,” Brown said.

Finally, the executive noted that real-time forensics and biometrics will play a key role in the solution.

Call centers must determine whether the customer is contacting them from the expected location — i.e., their home. Real-time forensics and biometrics can help with that. If the customer is using the new call feature on an Echo device, Brown said, then pinpointing the location of origin shouldn’t be too difficult.
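A minimal sketch of that kind of check, assuming the call metadata carries origin coordinates and a home location is on file (both assumed fields), could come down to a simple distance comparison:

```python
# Illustrative only: compare a call's reported origin with the customer's
# registered home location. Field names and the 50 km radius are assumptions.
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def call_from_expected_location(call_origin, home_on_file, max_km=50):
    return distance_km(*call_origin, *home_on_file) <= max_km

# Example: an Echo call reporting coordinates near the customer's home.
print(call_from_expected_location((39.74, -104.99), (39.73, -105.00)))  # True
```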

Of course, it would change the game if Alexa started making calls on behalf of customers. Brown believes that machine-to-machine communications are coming, in which voice assistants handle customer service on behalf of their owners, just as a live personal assistant would do.

At that point, verifying the device's location won't be enough. But how else is one robot to prove to another that it truly represents the customer it claims to? Brown and Webster will explore this emerging challenge in a future discussion, so stay tuned.