Are Digital Humans AI’s Next Evolution?

What would you do if you met a digital human?

It is not an idle question, nor are we asking how you would handle yourself if you suddenly landed in an alternate universe surrounded by robots and avatars.

As it turns out, digital humans are already among us.

Autodesk users have been interacting with one since late last year whenever they call into customer support. Travelers on Air New Zealand have been using the airline’s digital travel concierge for a little more than six months.

Australians with disabilities are now able to work with a digital human named Nadia, designed to help users better navigate the National Disability Insurance Scheme (NDIS) and find the information they need. Nadia can read users’ emotions by “watching” their faces – not to mention give them the experience of talking to a celebrity, sort of: Nadia’s voice is provided by Academy Award-winning actress Cate Blanchett.

Very soon, the banking customers of NatWest will meet Cora, their new personal banking assistant, who will look them in the eye as she talks to them and helps them along their financial journeys.

So, what do all of these digital humans have in common?

A company called Soul Machines, which is building what it believes the next generation of interactive, conversational artificial intelligence (AI) will look like.

Because, according to Soul Machines’ chief business officer, Greg Cross, they will actually have to look like something. AI, he said, will only become truly useful when the machines feel familiar to the people who use them.

That means they will have to do more than just sound like us – they will need to look like us, too, he said.

To that end, the company has developed what it calls the world’s first Virtual Nervous System, from which it painstakingly renders visually responsive, three-dimensional “virtual humans” – human-like avatars that can interact “face-to-face” with customers.

“We actually believe that, in time, all assistants will need to have a human face, because as humans, we are programmed at a DNA level to want to be able to look at someone when they are talking to us,” Cross said.

Facing the Conversation 

The problem with automated communication today, Cross noted, is that it tends to feel a bit stiff and, well, robotic. Not only does it fail to enhance the consumer’s experience – in some cases, it actively makes that experience worse.

And that, said Cross, is a big problem.

As he pointed out, anyone who wants to use a conversational interface to connect a human being and a smart machine basically has to solve two issues. First, the interface has to be able to build a “highly personal and customized interaction for the customer. Then, it has to make sure that interactions can keep expanding – and that is because the AI is learning, and thus interacting more efficiently.”

The Power of “Face-to-Face” Interactions

The intent, Cross noted, is not to trick users into believing the avatar they are speaking to is a real person. The firms Soul Machines works with across a variety of verticals, he said, make no effort to disguise the fact that customers are talking to a virtual human. Air New Zealand customers know they are dealing with an AI avatar when they use the concierge service.

As Cross maintained, there is no need to hide it, because putting a literal face on the technology only makes the interaction that much better. Customers actually like talking to a visual avatar – after all, Cross noted, they already tend to think of Alexa as a “she” instead of an “it.”

“We believe it is a much more personalized customer experience,” he said. “From here, we get to a position where customers can really have a more intimate experience, because we are better able to create and convey the complex range of emotions the human face can convey.”

Moreover, he noted, the inclusion of a facial focal point makes some of the AI’s behaviors more palatable for human consumers. For example, customers often don’t like the idea of a device using its camera to “scan” their faces to read data about their mood – they tend to find it “creepy.” But the exact same activity doesn’t read as off-putting when done by a virtual human – instead, it simply reads as the AI looking at the customer.

Which is why, Cross pointed out, the firm is building so many virtual humans for so many partners.

Putting a Face to a Name 

This week, NatWest announced that it will be taking the digital human concept for a spin, using it to cater to consumers’ customer service needs around basic banking queries.

At the start of 2017, the bank deployed a text-based chatbot named “Cora,” which can already handle 200 basic banking queries and now holds 100,000 conversations a month. The goal of the partnership is two-fold, Cross noted. First, he said, they are hoping to help Cora carry all of those basic banking skills over into face-to-face personal interactions. That entails more than just a direct port of the text-based conversational platform, because the translation is not exactly one-to-one.

“People talking face-to-face use very different language than when they are texting,” Cross observed.

Second, beyond merely adopting Cora’s current skill set, he noted, the bigger vision is to expand the universe of what she can do in cooperation with the customer as “they get to know each other better.”

“The end goal is to get Cora to a point where she can be a consumer’s personal banker – that is the hope for this digital human, that the more one uses it, the more helpful it is going to get,” Cross said. “That is the promise for the future in the interaction.” 

What’s Next

Cross and the team at Soul Machines know a thing or two about avatars. The amazing – if complicated – thing, Cross said, is that they are not developing this technology in any single direction so much as they are building emotional content into these avatar smartbots, which can then carry those benefits across a range of use cases.

One of their avatars, Baby X, now has a body and can realistically move his arms and legs, meaning the world of gestural communication is opening up to the digital humans being built in the very near future.

The grander vision, Cross noted, is to develop a series of tools, offered up freely to third parties, so that everyone can have the custom-built digital assistant they want or need.

And they are building those tools to do some very different things. For example, Cross noted, the company is currently using its Virtual Nervous System to construct a virtual human for someone who is no longer alive: an art grandmaster who has been dead for over 100 years.

“What we are looking at is creating a digital grandmaster artist,” he said. “So that someone at their favorite art gallery or museum can be standing in front of one of the great paintings of the world, having a digital version of that artist explaining the work.”

So, what would you do if you met a digital human? You might learn something, organize your finances, book a trip, get tech support or even meet the digital ghost of a great mind from generations past.

If Soul Machines has its way, that’s just scratching the surface.