Your Brain On AI

Almost overnight, talking to one’s computer or other electronics changed from being something people rarely did unless they were extremely angry (who hasn’t yelled at a computer that ate their file?) to something that is well on its way to becoming as normal as flipping on a light switch or swiping a phone screen. Alexa, Cortana, Bixby, Allo, Siri — if we learned nothing else from CES last week, it is that the virtual personal assistants powered by AI are coming soon to a computer/car/lamp/speaker/smartphone/refrigerator/television near you.

There are even opportunities to talk to AI for those who don’t want to use their voice. For about a year, an army of chatbots has been on the loose on Facebook Messenger (among many, many other places), ready at a moment’s notice to help.

Or try to help anyway.

The problem with AI is that the concept is often somewhat better than the execution. PYMNTS CEO Karen Webster’s long-running feud with Poncho, Facebook’s weather cat AI, is well-documented and becoming more acrimonious by the day. A handful of Amazon Echo users have complained that, if they put the device too close to the TV, Alexa might order an unwanted pizza when a certain commercial comes on.

And, as Clinc CEO and Cofounder Professor Jason Mars told Karen Webster in a recent chat, even setting aside the bugs, there is a certain stiffness to talking to the various AIs on the market, including when they are working just fine.

“Alexa is a platform where any company or user can create their own little skills that live in Alexa. So, Uber can program in the command ‘Call me an Uber,’ and Alexa will send it straight away. But you can’t say, ‘I need a ride to the library,’ because that’s not a programmed command. It is a constrained experience for a merchant or vendor. If you want an Alexa skill, you have to build it yourself.”

Build it yourself — and then train consumers to use it by learning the correct voice commands.
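
For a sense of what that constraint looks like in practice, here is a minimal, hypothetical sketch (in Python) of the kind of fixed-command matching an individual voice skill boils down to. The command phrases and intent names are invented for illustration and aren’t drawn from any real Alexa skill.

```python
# Hypothetical sketch of a fixed-command "skill": only phrasings that were
# explicitly programmed in advance are understood. The commands and intent
# names are invented for illustration.
SKILL_COMMANDS = {
    "call me an uber": "request_ride",
    "what's the weather": "weather_report",
}

def handle(utterance: str) -> str:
    # Exact-match lookup: anything the developer didn't anticipate falls through.
    return SKILL_COMMANDS.get(utterance.lower().strip(), "Sorry, I don't know that one.")

print(handle("Call me an Uber"))               # -> request_ride
print(handle("I need a ride to the library"))  # -> Sorry, I don't know that one.
```

The burden of remembering the exact phrasing sits with the user, which is precisely the training problem Mars is pointing to.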

And this, Mars and his cofounders, Professors Lingjia Tang and Michael Laurenzano, said, gets at the central problem that all the contemporary AIs are trying to solve: finding a way for people to easily and naturally interact with machines using natural human language.

And while many solutions are emerging, Mars noted, the underlying problem persists and is still crying out for a better answer.

The Clinc team thinks they just might have built one.

 

Rethinking Machine Learning 

All three cofounders are University of Michigan professors with access to “the best science and technology you can find today,” according to Mars. And that access made something very apparent: For all the neat skills they have, the emerging AIs heating up the market today aren’t really representative of where computer science is with AI or of the kinds of innovations that are actually possible.

“The key difference between what we are doing and how other systems work is that we are focused around being able to understand unstructured, unconstrained speech. So, you can speak to it [Clinc’s AI, Finie] like it is a human in the room.”

That, he noted, is a bit different from the approaches currently in the market, even those that are trying to teach AIs to understand natural human language. Mars explained that most AIs coming to market use “grammar-based approaches.”

“So, in a rule-based, grammar-based system, the AI looks for nouns and verbs, does some analysis on which adjective corresponds to which noun and then tries to follow rules about how it should respond. That means a lot of very specific programming, and it still wouldn’t get close to capturing the breadth and depth of the English language.”
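
To make the contrast with what comes next concrete, here is a deliberately simplified, hypothetical sketch (in Python) of that rule-based, grammar-based style: a few hand-written verb and noun rules, with everything outside them falling through unrecognized. The rules and intent names are invented and bear no resemblance to any production system.

```python
# Simplified, hypothetical stand-in for a grammar-based pipeline: hand-written
# rules map expected verbs and nouns to intents; anything outside the rules
# goes unrecognized. All rules and intent names are invented for illustration.
import re

QUESTION_VERBS = {"what", "show", "tell", "check"}
NOUN_RULES = {
    "balance": "balance_inquiry",
    "spending": "spending_summary",
    "transfer": "money_transfer",
}

def grammar_parse(utterance: str) -> str:
    words = re.findall(r"[a-z']+", utterance.lower())
    # Rule 1: the utterance must open with one of the expected question verbs.
    if not words or words[0] not in QUESTION_VERBS:
        return "unrecognized"
    # Rule 2: it must mention one of the nouns the rules know about.
    for word in words:
        if word in NOUN_RULES:
            return NOUN_RULES[word]
    return "unrecognized"

print(grammar_parse("What is my balance"))            # -> balance_inquiry
print(grammar_parse("Am I overspending on takeout"))  # -> unrecognized
```

Every additional phrasing a user might try means another rule someone has to anticipate and write.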

Clinc, on the other hand, leverages recurrent neural networks, which are loosely modeled on the neurons in human brains. Instead of programming commands and rules for Finie to follow, the Clinc team is trying to teach it to think.

“We go through a period of training the brain. We reason about the intelligence we want — say, knowing about spending. Then, we collect examples of humans asking questions about spending all around the world and then curate those into the brain so the AI learns without having to program constrained rules,” Mars explained. “Doing that, we can train the system to understand very messy ways of asking questions because the neural networks themselves are able to recognize things that are similar — though not identical — to what it’s trained with.”
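
As a rough illustration of that training-by-example idea, the sketch below builds a tiny recurrent intent classifier in PyTorch: it embeds words, runs a GRU over each utterance and learns intents from varied example phrasings rather than from hand-written rules. The intents, utterances and model sizes are all invented, and a toy model trained on six sentences says nothing about real-world accuracy; it only shows the shape of the approach, not Clinc’s actual system.

```python
# Toy sketch of training an intent classifier from example phrasings instead
# of rules. Intents, utterances and hyperparameters are invented for illustration.
import torch
import torch.nn as nn

EXAMPLES = [
    ("how much did i spend on coffee last month", "spending"),
    ("what did my restaurant bills add up to", "spending"),
    ("am i blowing too much money on takeout", "spending"),
    ("what is my balance", "balance"),
    ("how much money do i have right now", "balance"),
    ("do i have enough in checking to cover rent", "balance"),
]
INTENTS = sorted({intent for _, intent in EXAMPLES})

# Build a word-level vocabulary from the example utterances.
vocab = {"<unk>": 0}
for text, _ in EXAMPLES:
    for word in text.split():
        vocab.setdefault(word, len(vocab))

def encode(text):
    # Map an utterance to a (1, sequence_length) tensor of word ids.
    return torch.tensor([[vocab.get(w, 0) for w in text.split()]])

class IntentRNN(nn.Module):
    """Embed words, run a GRU over the utterance, classify the final state."""
    def __init__(self, vocab_size, num_intents, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, num_intents)

    def forward(self, token_ids):
        _, hidden = self.rnn(self.embed(token_ids))
        return self.out(hidden[-1])

model = IntentRNN(len(vocab), len(INTENTS))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Train on the curated example phrasings rather than hand-written grammar rules.
for _ in range(200):
    for text, intent in EXAMPLES:
        optimizer.zero_grad()
        loss = loss_fn(model(encode(text)), torch.tensor([INTENTS.index(intent)]))
        loss.backward()
        optimizer.step()

# Ask a paraphrase that never appears verbatim in the training data. With a
# real, much larger curated dataset, this is where generalization would show up.
query = "how much cash is sitting in my account"
print(query, "->", INTENTS[model(encode(query)).argmax().item()])
```

The point is the workflow: collect varied human phrasings, label the intent behind them and let the network learn the pattern, rather than enumerating every acceptable sentence by hand.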

It is literally, he said, designed to emulate the way human brains learn.

 

So, What To Do With So Smart A Program?

Finie, since Clinc’s launch about a year and a half ago, has been focused on one problem: helping consumers connect with their financial stories.

But Clinc’s clients aren’t consumers; they are financial institutions looking for an Alexa of their own to help customers navigate a variety of financial functions. That means consumers won’t ever have to download Finie to their phones, but they will encounter it embedded in the mobile and investment sites of some of their favorite FIs.

As for how it is being used, that varies. For some, it is a digital mobile banking assistant. But, he said, as the groundswell of interested potential partners has grown, the number of custom use cases has been on the rise as well.

“Intuit is looking to see how they can build an intelligent assistant that consumers can talk to while they are filling out their taxes,” Mars noted by way of example.

But Finie is capable of knowing much more, and about a wider variety of topics, because, at base, it is not a financial services AI; it is a learning AI that happens to be employed in financial services and that could, and likely will, be deployed across a whole host of verticals.

“Retail is a huge use case. You can imagine wanting to be able to ask a kiosk where the bread is and being able to have these kinds of systems understand. We’ve had some conversations with Samsung and their refrigerators you can talk to. The firms we are seeing are interested in not having to base their AI systems around customers having to learn the right key phrases to interact. That adds a burden to the users because they then have to learn how to speak to these interfaces.”

And, under other models, those that don’t want to burden their customers will end up burdening themselves.

“For example, in financial services, that would mean you would end up having to program 30,000 different ways of asking, ‘What is my balance?’”

 

Solving The Baseline Problem

The baseline problem for AI to solve is enabling regular users to easily articulate what they want to an AI and get an appropriate response.

And that problem, Mars said, is a constant, “whether it takes the form of a chatbot or a voice-activated personal assistant.”

It also, he noted, isn’t going anywhere unless the technology on offer actually provides an easily usable touchpoint.

“We don’t care what you call it. We just want it to be the right solution for connecting you to your data through a very simple AI.”

It’s a big goal and a tough one considering how many other AIs are out there in the field (and how powerful their backers are).

But if Clinc’s AI can outthink them, the AI race may have just gotten a bit more interesting.