Almost overnight, talking to one's computer or other electronics changed from being something people rarely did unless they were extremely angry (who hasn't yelled at a computer that ate their file?) to something that is well on its way to becoming as normal as flipping on a light switch or swiping a phone screen. Alexa, Cortana, Bixby, Allo, Siri: if we learned nothing else from CES last week, it is that virtual personal assistants powered by AI are coming soon to a computer/car/lamp/speaker/smartphone/refrigerator/television near you.
There are even ways to talk to AI for those who don't want to use their voice. For about a year, an army of chatbots has been on the loose on Facebook Messenger (among many, many other places), ready at a moment's notice to help.
Or try to help, anyway.
The problem with AI is that the concept is often somewhat better than the execution. PYMNTS CEO Karen Webster's long-running feud with Poncho, Facebook's weather-cat AI, is well-documented and growing more acrimonious by the day. A handful of Amazon Echo users have complained that, if the device sits too close to the TV, Alexa might order an unwanted pizza when a certain commercial comes on.
And, as Clinc CEO and cofounder Professor Jason Mars told Karen Webster in a recent chat, even setting aside the bugginess, there is a certain stiffness to talking with the various AIs on the market, even when they are working just fine.
"Alexa is a platform where any company or user can create their own little skills that live in Alexa. So, Uber can program in the command 'Call me an Uber,' and Alexa will send it straight away. But you can't say, 'I need a ride to the library,' because that's not a programmed command. It is a constrained experience for a merchant or vendor. If you want an Alexa skill, you have to build it yourself."
Build it yourself, and then train consumers to use it by learning the correct voice commands.
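Mars' point about programmed commands can be sketched in a few lines. The following is a hypothetical illustration, not Alexa's actual skill API: a dispatcher that fires only on exact, pre-registered phrases, so any paraphrase falls through unrecognized. The command table and action names are invented for demonstration.

```python
# Hypothetical sketch of exact-phrase command dispatch (not a real skill API).
# Only pre-registered utterances map to actions; paraphrases are not matched.
COMMANDS = {
    "call me an uber": "ride_requested",  # the one phrase the skill was taught
}

def handle(utterance: str) -> str:
    """Return the registered action for an exact phrase, else 'unrecognized'."""
    action = COMMANDS.get(utterance.lower().strip())
    return action if action else "unrecognized"

print(handle("Call me an Uber"))               # matches the programmed phrase
print(handle("I need a ride to the library"))  # no rule covers this paraphrase
```

The second call fails not because the request is unreasonable, but because nobody enumerated that exact wording, which is the constraint Mars is describing.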
And this, Mars and his cofounders, Professors Lingjia Tang and Michael Laurenzano, said, gets at the central problem that all the contemporary AIs are trying to solve: finding a way for people to interact easily and naturally with machines using ordinary human language.
And while many solutions are emerging, Mars noted, that underlying problem persists and is crying out for a better answer.
The Clinc team thinks they just might have built one.
Rethinking Machine Learning
All three cofounders are University of Michigan professors with access to "the best science and technology you can find today," according to Mars. And that access made something very apparent: For all their neat skills, the AIs heating up the market today aren't really representative of where the science actually is and what kinds of innovations are actually possible.
"The key difference between what we are doing and how other systems work is that we are focused around being able to understand unstructured, unconstrained speech. So, you can speak to it [Clinc's AI, Finie] like it is a human in the room."
That, he noted, is a bit different from the approaches currently in the market, even those that are trying to teach AIs to understand natural human language. Most AIs coming to market, Mars explained, rely on "grammar-based approaches."
"So, in a rule-based, grammar-based system, the AI looks for nouns and verbs, does some analysis on which adjective corresponds to which noun and then tries to follow rules about how it should respond. That means there is a lot of very specific programming, and it still wouldn't get close to capturing the breadth and depth of the English language."
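The rule-based approach Mars describes can be illustrated with a minimal, invented sketch: tag each word using a tiny hand-written part-of-speech lexicon, then fire a verb-plus-noun rule. Every lexicon entry and rule here is an assumption made for demonstration; a real system would need vastly more of both and would still miss paraphrases.

```python
# Invented illustration of a rule-based, grammar-style parser.
# A hand-written lexicon tags words; a verb+noun rule produces an action.
LEXICON = {
    "show": "VERB", "check": "VERB",
    "balance": "NOUN", "spending": "NOUN",
    "my": "DET", "recent": "ADJ",
}

def parse(utterance: str):
    """Apply the verb+noun rule; return an action name or None if no rule fires."""
    tags = [(w, LEXICON.get(w, "UNK")) for w in utterance.lower().split()]
    verbs = [w for w, t in tags if t == "VERB"]
    nouns = [w for w, t in tags if t == "NOUN"]
    if verbs and nouns:
        return f"{verbs[0]}_{nouns[0]}"  # e.g. "check_balance"
    return None

print(parse("check my balance"))          # the rule fires
print(parse("how much money do I have"))  # no listed verb/noun, so the rule misses
```

The second query means the same thing as the first, but because none of its words appear in the lexicon, the rule never fires, which is why the rule-by-rule approach scales so poorly.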
Clinc, on the other hand, leverages recurrent neural networks, which are modeled on the actual neurons in human brains. Instead of programming commands and rules for Finie to follow, the Clinc team is trying to teach it to think.
"We go through a period of training the brain. We reason about the intelligence we want, say, knowing about spending. Then, we collect examples of humans asking questions about spending all around the world and curate those into the brain, so the AI learns without our having to program constrained rules," Mars explained. "Doing that, we can train the system to understand very messy ways of asking questions, because the neural networks themselves are able to recognize things that are similar, though not identical, to what they were trained with."
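To make the contrast concrete, here is a toy, example-driven intent matcher. It is emphatically not Clinc's recurrent-neural-network model; it is a bag-of-words cosine-similarity stand-in, with intent names and example questions invented for illustration. What it shows is the principle Mars describes: match new utterances against collected examples, so a phrasing the system has never seen verbatim can still land on the right intent.

```python
# Toy stand-in for example-driven intent recognition (NOT Clinc's actual model).
# Intents and example questions below are invented for illustration.
from collections import Counter
import math

EXAMPLES = {
    "spending": ["how much did I spend", "what was my spending last week",
                 "where is my money going"],
    "balance": ["what is my balance", "how much money do I have"],
}

def vec(text: str) -> Counter:
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(utterance: str) -> str:
    """Pick the intent whose best example is most similar to the utterance."""
    v = vec(utterance)
    return max(EXAMPLES,
               key=lambda intent: max(cosine(v, vec(e)) for e in EXAMPLES[intent]))

print(classify("did I spend much on coffee"))  # not verbatim in any example
```

The query "did I spend much on coffee" appears in none of the training examples, yet it overlaps enough with the spending examples to be classified correctly; a neural model generalizes far better than word overlap, but the training-by-example principle is the same.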
It is literally, he said, designed to emulate the way human brains learn.
So, What To Do With So Smart A Program?
Since Clinc's launch about a year and a half ago, Finie has been focused on one problem: helping consumers connect with their financial stories.
But Clinc's clients aren't consumers; they are financial institutions looking for an Alexa of their own to help customers navigate a variety of financial functions. That means consumers won't ever have to download Finie to their phones, but they will encounter it embedded in the mobile and investment sites of some of their favored FIs.
As for how it is being used, that varies. For some, it is a digital mobile banking assistant. But, he said, as they've seen an ever-growing swell of interested potential partners, the number of custom use cases has been rising as well.
"Intuit is looking to see how it can build an intelligent assistant that consumers can talk to while they are filling out their taxes," Mars noted by way of example.
But Finie is capable of knowing much more, about a far wider variety of topics, because, at base, it is not a financial services AI; it is a learning AI that happens to be employed in financial services, one that could, and likely will, be deployed across a whole host of verticals.
"Retail is a huge use case. You can imagine wanting to ask a kiosk where the bread is and having these kinds of systems understand. We've had some conversations with Samsung about their refrigerators you can talk to. The firms we are seeing are interested in not having to base their AI systems around customers learning the right key phrases to interact. That adds a burden to users, because they have to learn how to speak to these interfaces."
And, under other models, those that don't want to burden their customers will end up burdening themselves.
"For example, in financial services, that would mean you would end up having to program 30,000 different ways of asking, 'What is my balance?'"
Solving The Baseline Problem
The basic problem for AI to solve is enabling everyday users to easily articulate what they want and get an appropriate response.
And that problem, Mars said, is a constant, "whether it takes the form of a chatbot or a voice-activated personal assistant."
It also, he noted, isn't going anywhere unless the technology on offer actually provides an easily usable touchpoint.
"We don't care what you call it. We just want it to be the right solution for connecting you to your data through a very simple AI."
It's a big goal, and a tough one considering how many other AIs are out in the field (and how powerful their backers are).
But if Clincâs AI can outthink them, the AI race may have just gotten a bit more interesting.