Consumers are increasingly comfortable chatting with their robot friends. The PYMNTS/Visa How We Will Pay survey indicates that a growing segment of consumers are already pursuing connected commerce experiences through a variety of devices and interfaces, and are highly enthusiastic about doing so more frequently as the technology advances.
But those advances are still eagerly awaited, because voice technology still has a tendency to go wrong, often in amusing ways.
One spooked Twitter user, for example, noted that Alexa seemed to be carrying on a romance with the Google Assistant in their kitchen.
Although Alexa and the Google Assistant are apparently surprisingly good at talking to each other, the most common complaint is that they are sometimes not particularly good at talking to real people. One user asked Alexa to turn out the living room lights, which Alexa interpreted as a request to play Bon Jovi’s 1986 classic “Livin’ On A Prayer” as a lullaby.
Another noted that when she asked Alexa to turn on her porch light, Alexa informed her she could not follow that request — because the user didn’t own a Porsche.
And while it might be easy — if perhaps a bit pointless — to get mad at Alexa, it really isn’t fair. Human beings are harder to talk to, particularly if you happen to be a machine, noted Pulse Labs Co-Founder and CEO Abhishek Suthan.
“People come from different backgrounds, cultures and languages, and what’s colloquial in one place means something totally different someplace else. Building those experiences, keeping the user in mind and making that experience personalized enough for them is the holy grail, which is what we’re trying to build,” Suthan said.
Artificial intelligence (AI), he noted, often grows “upside down” in this regard — with design being pegged around macro-use, instead of being personalized to an individual user.
It’s a mistake, Suthan said — one Pulse Labs hopes to improve by teaching AIs how to converse better with those who want to talk to them.
Building a Better Conversation Partner
Pulse Labs connects voice app developers to app users — specifically users in the demographic group the voice app is targeting, allowing them to test-drive talking to the app before it launches. Developers get data on which prompts work, which ones fall flat and what “unexpected” situations come up in the interactive journey.
The tiny firm — three people as of right now — got its start in the first-ever Alexa Accelerator, and it recently raked in $2.5 million in seed funding to begin building out its vision. Among the first things the funding is earmarked for is expanding the company past its three founders. (Dylan Zwick and Akansha Mehta co-founded the firm with Suthan.)
The company’s investor list is notable to say the least. Madrona Venture Group led the round, with participation from Techstars Ventures, the Amazon Alexa Fund and Bezos Expeditions.
“As one of the companies in the inaugural Alexa Accelerator class, we are excited to make a follow-on investment in Pulse Labs as the company continues to innovate in the area of skill testing,” Alexa Fund Director Paul Bernard said when the funds were first announced. “Pulse provides a great way for brands to understand user interaction and gain rich feedback to create engaging customer experiences.”
The goal is to make Alexa more adaptable and to help developers get ahead of future issues.
Pulse Labs wants to make it easier for Alexa — and later this year the Google Assistant — to go off script and be able to adapt more flexibly in real time to keep the conversation moving.
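To make the "off-script" problem concrete, here is a minimal sketch in plain Python of the pattern skill testing is meant to stress: an utterance router that falls back to a clarifying reprompt instead of a dead end when nothing matches. All names and phrases here are hypothetical illustrations, not Pulse Labs' tooling or Amazon's actual skill API.

```python
# Hypothetical intent map: keyword phrases -> canned responses.
INTENTS = {
    "lights off": "Okay, turning off the living room lights.",
    "porch light on": "Okay, turning on the porch light.",
}

def handle_utterance(utterance: str, reprompts: int = 0) -> str:
    """Route an utterance to an intent, falling back to a reprompt
    instead of failing silently when nothing matches."""
    text = utterance.lower()
    for phrase, response in INTENTS.items():
        # Naive substring match; a real assistant uses trained
        # intent classifiers, which is where mishearings creep in.
        if all(word in text for word in phrase.split()):
            return response
    # Fallback path: ask for clarification to keep the conversation
    # moving -- the "unexpected" situations pre-launch testing surfaces.
    if reprompts < 2:
        return "Sorry, I didn't catch that. Did you want the lights or the music?"
    return "I'm still not sure what you meant. Let's try again later."

print(handle_utterance("please turn the porch light on"))
```

The point of testing with real users before launch is precisely to discover which utterances land in that fallback branch, so developers can add handling for them rather than leaving users guessing.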
The power of voice-activated interactions, Suthan noted, comes from the ways in which they can make things feel effortless for consumers — a virtual assistant can shop, pay bills or change the music, among other tasks.
But that power disappears if consumers feel they’re wasting too much time guessing how to prompt the system and thus stop using the app — something Pulse Labs is trying to avoid through its skill testing.