For Retailers, Tackling The Data/Time Conundrum

Data is useful only when it’s fully utilized.

Put a bit more simply: Retailers face both a data problem and a time problem.

There’s too much data and not enough time.

In an interview with PYMNTS, Nathan Trueblood, VP of Product Management at DataTorrent, said customer expectations of retailers are changing — adding both challenge and opportunity to that data/time conundrum.

Where once individuals might have shied away from providing personal information beyond, say, an address and a telephone number, nowadays, “a customer somewhat expects that they’re going to be known by their provider. By that I mean when you’re on the web, you might be interacting [with] a retailer and switch over to the mobile app and then walk into the store and do something at the point of sale. That’s the true notion of omnichannel … the customer expects you to know who they are and [to] use the data collected across all … those channels,” Trueblood said.

But the onus is on the retailer to use that data in a productive and meaningful way, he emphasized, and to make the consumer experience more effective, whether by surfacing relevant recommendations or by shortening time to fulfillment.

“That’s where there’s a ton of innovation being applied in machine learning and artificial intelligence to essentially make … more personalized and data-driven recommendations and experiences to consumers. But you … also [have] to protect the consumer,” he said, which means holding the integrity and security of that data in high regard.

Retailers must be mindful that omnichannel conduits offer more ways for hackers to commit fraud unless point-of-sale systems, web and mobile are all interconnected, Trueblood said. Consider the fraudster who buys something online but picks it up in-store, moving beyond the confines of card-not-present fraud.

Against that backdrop, retailers must consider the sheer volume of data available on transactions.

“What we’re seeing with all of our customers for fraud prevention or risk management is that it is a big challenge. They choose DataTorrent because we can handle the data volumes in a fault-tolerant way and produce insight from massive amounts of transactions — millions of transactions a second. We can produce an answer about what transactions may be fraudulent in a very short period of time,” Trueblood said.

“Streaming applications … [are] used … to either help drive revenue for the business or save money,” he continued. “Raising revenue would be using that data to provide personalized recommendations based on what a customer is doing now versus what we know about them historically. Saving money would come from preventing fraud.”
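In practice, the pattern Trueblood describes is to enrich each transaction with what's already known about the customer as it arrives, then apply a decision inline rather than in an after-hours batch job. The Python sketch below is a minimal, single-process illustration of that idea; the field names, spend threshold and rule are hypothetical, and a production system would run this logic distributed and fault-tolerant across a cluster, fed by a message bus rather than an in-memory list.

```python
from collections import defaultdict

# Hypothetical threshold, purely for illustration.
WINDOW_SPEND_LIMIT = 5_000.00

def score_stream(transactions):
    """Consume a stream of transaction events and yield fraud alerts.

    Each event is enriched with what we already know about the customer
    (here, a running spend total; a real system would use a sliding time
    window) before a rule is applied inline -- a toy stand-in for the
    enrich-then-decide pattern described above.
    """
    spend_by_customer = defaultdict(float)
    for txn in transactions:
        spend_by_customer[txn["customer_id"]] += txn["amount"]
        if spend_by_customer[txn["customer_id"]] > WINDOW_SPEND_LIMIT:
            yield {"customer_id": txn["customer_id"],
                   "txn_id": txn["txn_id"],
                   "reason": "spend exceeds window limit"}

# Usage: in production this generator would be fed by a message bus
# (e.g., Kafka) rather than an in-memory list.
events = [
    {"txn_id": 1, "customer_id": "c42", "amount": 4_800.00},
    {"txn_id": 2, "customer_id": "c42", "amount": 350.00},
]
for alert in score_stream(events):
    print(alert)
```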

Machine learning and artificial intelligence, applied in tandem, draw on that data history so that when a new event happens, businesses can predict whether it's something they should take a closer look at or stop outright.
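The article doesn't specify which models DataTorrent's customers use, but the train-on-history, score-the-new-event loop Trueblood describes can be sketched in a few lines with scikit-learn. The features, data and threshold below are invented purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical features: [amount, hour_of_day, is_new_device]
# and labels (1 = confirmed fraud). Real systems use far richer features.
X_history = np.array([
    [25.0,   14, 0],
    [900.0,   3, 1],
    [40.0,   19, 0],
    [1200.0,  2, 1],
])
y_history = np.array([0, 1, 0, 1])

# Train on what history has taught us ...
model = LogisticRegression().fit(X_history, y_history)

# ... then score each new event as it arrives.
new_event = np.array([[850.0, 4, 1]])
fraud_probability = model.predict_proba(new_event)[0, 1]

print(f"fraud probability: {fraud_probability:.2f}")
if fraud_probability > 0.8:  # hypothetical review threshold
    print("flag for review")
```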

Communication between teams within an organization remains key, he said.

“You might have a team within an enterprise that’s focused on detecting and preventing account hacking. That team may be totally separate from the team that deals with online transaction fraud, but they share a characteristic, which is, they’re dealing with a huge amount of data. They have to make decisions in a short period of time … but once they get that application in place and they have figured out how to handle that data volume and enrich the events, there’s a real opportunity to add additional use cases quickly and to have these different use cases talk to each other,” Trueblood said.

“There’s kind of a network effect that happens among these applications,” said the executive. He noted that his own firm’s recent RTS 3.10 platform release uses its Apoxi™ framework to glue independent applications together so they can share information, making them more effective at uncovering insights and triggering action, whether that means preventing payment fraud, stopping online account takeover or enhancing the customer experience through product recommendations.
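The article doesn't describe Apoxi's actual API, but the "network effect" Trueblood points to resembles a publish/subscribe pattern in which one application's finding becomes another's input signal. A toy, in-process sketch of that pattern (all names hypothetical, not DataTorrent code):

```python
from collections import defaultdict

class SignalBus:
    """Toy in-process pub/sub bus, illustrating only the pattern of
    independent applications sharing signals with one another."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

bus = SignalBus()
compromised_accounts = set()

# The account-takeover team's app publishes what it finds ...
def on_takeover_detected(account_id):
    compromised_accounts.add(account_id)

bus.subscribe("account.compromised", on_takeover_detected)

# ... and the payment-fraud app weighs that signal in its own decision.
def score_payment(txn):
    if txn["account_id"] in compromised_accounts:
        return "block"
    return "allow"

bus.publish("account.compromised", "acct-7")
print(score_payment({"account_id": "acct-7", "amount": 99.0}))  # block
```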

Thus, speed matters in that it can mean the difference between detection and prevention, he surmised. That's the difference between analyzing information after the fact and analyzing it in real time.

“It’s kind of like saying, ‘Well, you know, they broke into the house and stole stuff … [we’ll] change the locks for next time.’ You’d much rather know that they’re inside your house and just stop them and call the police right away,” Trueblood said.

When asked about a near-term roadmap, the executive said, “You’ll see more applications that illustrate how a business can be more effective and more successful and frankly more competitive by producing business insights close to the time when things happen. We harness open-source innovation; we make it enterprise-grade.”

Customers, he said, don’t have 18 months to develop and deploy an application. They need to get these things up and running in a quarter or two. And then there may be another cycle of innovation in six to 12 months.

“A big trend we’re going to see over the next few years is not just a discussion about all the new techniques of artificial intelligence, but it will be all the ways that it’s starting to be applied in production. I think we’re still very much in the early days of customers figuring out how to actually deploy … all this cool stuff that’s happening in artificial intelligence,” he said.