Data is one thing — a valuable commodity that makes information the oil of the digital age. However, keeping track of data in a way that enables secure storage, efficient access and analysis, compliance with regulations and — ultimately — its profitable use is quite another matter.
Those two extremes come together in a rising technology called multi-variant tokenization. At least, that’s the promise offered by the company that developed it: TokenEx. CEO Alex Pezold and Karen Webster recently dug deep into the issues of data and tokenization during a PYMNTS webinar entitled “Simplifying the Tokenization of Diverse Data.”
To put that digital discussion in context: corporations and other organizations have long collected and recorded data, of course — data (aka primary sources) is the foundation of history, after all. That drive continues. According to Pezold, “most insurance companies have about 110 data points on each person out there, even though we have no idea what they are storing.”
Thanks to more sophisticated and ubiquitous data collection efforts (really, go try to live “off the grid”— see what happens and try to make it through the winter), and the importance of all that information to retail, marketing and manufacturing (to say nothing of politics and governmental controls), data is “what’s powering the economy,” Pezold said. “Data is causing significant industry changes.”
However, power requires organization, and that’s a big part of the pitch behind multi-variant tokenization. That proprietary technology allows a company’s sensitive data to be identified, mapped and secured as it moves across various environments, providing continuous, unparalleled security, while ensuring an organization’s operations remain uninterrupted. As a result, these companies can generate higher returns while cutting costs and reducing compliance burdens at the same time.
Tokens Vs. Encryption
Pezold told Webster that tokenization tends to work best when applied to data in transit (tokenization, of course, is the process of replacing sensitive values with non-sensitive stand-ins called “tokens”). Encryption is better suited to “data at rest,” he said — though it can produce “blobs of ciphertext,” which represents the opposite of data efficiency.
Look at it this way: Tokenization can, in a very short time, translate “30 million credit card numbers into 30 million tokens,” Pezold said. “With encryption, you will [end up with] far more ciphertext than you started with,” which is only part of the problem, as one then has to figure out how to fit that data into the organization’s back-end data set technology and platforms. That said, the trend is for organizations to move “away from encryption because tokenization presents an easier solution for [protecting] data.”
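The size advantage Pezold describes comes from the basic mechanics of vault-style tokenization: the sensitive value is swapped for a random token of the same shape, so the token drops into fields sized for a card number instead of inflating like ciphertext. The sketch below is a minimal, generic illustration of that idea — TokenEx’s actual scheme is proprietary, and the class and field choices here are assumptions for illustration only.

```python
import secrets
import string

class TokenVault:
    """Minimal sketch of vault-based tokenization (illustrative only).

    The sensitive value lives only in the vault; downstream systems
    see a same-length, same-format token instead.
    """

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, pan: str) -> str:
        # Keep the length and the last four digits so the token still
        # fits back-end systems that expect a 16-digit card number.
        body = "".join(secrets.choice(string.digits) for _ in range(len(pan) - 4))
        token = body + pan[-4:]
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only a system with vault access can recover the real value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # a standard test card number
assert len(token) == 16 and token.endswith("1111")
assert vault.detokenize(token) == "4111111111111111"
```

Note how a 16-digit input yields a 16-digit token — the “30 million numbers into 30 million tokens” point — whereas encrypting the same value would typically produce a longer, differently formatted blob.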
Retail Use Case
So, what is the deal, really, with multi-variant tokenization? How does it work in the real world of global commerce?
Pezold could not name names, but during the PYMNTS webinar, he offered a use case involving a large global retailer client, one that employs customer relationship management (CRM), enterprise resource planning (ERP), loyalty offerings, eCommerce and other features and technologies. A customer of this particular retailer — well, an affluent shopper with access to extraordinarily quick means of transport — might, he said by way of example, “buy something [from this retailer] in Italy [one day], then buy something in New York the next day, and then buy something in Singapore the day after that.”
Potential problems crop up right away.
Such a large, complex organization is likely to maintain different data sets and platforms for different business units and offices. Data residency rules and other compliance requirements also encourage inefficient separation of all that customer information. A retailer of this size might engage multiple payment service providers — at the least, if one goes down due to technical, business or other issues, a backup is already in place. This retailer, however, now uses multi-variant tokenization.
Simplicity From Complexity
In this case, three different data systems came under the unifying power of a single tokenization technology, which allows the reconciliation of data across those different sets in ways that promote linkage and what Pezold called “referential integrity among systems.”
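One common way to get that kind of referential integrity is deterministic tokenization: the same sensitive value always maps to the same token, so records in separate systems can be joined on the token without any system ever holding the real card number. The sketch below illustrates the general idea with a keyed HMAC — the key, the 16-digit formatting, and the use of HMAC itself are assumptions for illustration, not a description of TokenEx’s proprietary method.

```python
import hmac
import hashlib

# Assumption: a shared secret key used by the tokenization service.
SECRET_KEY = b"example-tokenization-key"

def deterministic_token(pan: str) -> str:
    """Map a card number to a stable 16-digit token via keyed HMAC."""
    digest = hmac.new(SECRET_KEY, pan.encode(), hashlib.sha256).hexdigest()
    # Reduce the digest to 16 digits so it fits fields sized for a PAN.
    return str(int(digest, 16))[:16].zfill(16)

# The same shopper's card yields the same token in every system,
# so CRM, ERP and loyalty records can be linked on the token alone.
crm_token = deterministic_token("4111111111111111")
erp_token = deterministic_token("4111111111111111")
assert crm_token == erp_token
assert deterministic_token("5500000000000004") != crm_token
```

The design trade-off: a deterministic token enables the cross-system linkage described above, while the random vault token shown earlier gives stronger unlinkability — which is part of why a “multi-variant” approach, offering more than one token type, would be useful.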
That technology enables the retailer to gain a clearer view of what the data is really saying, “making better sense of how buyers are buying,” Pezold said, as well as vendor relationships and other business areas. That retailer “now has complete visibility [into] what its customer base is doing throughout the world.” In addition, costs are reduced in compliance and other areas — complexity, after all, is usually not a means toward saving money, no matter the endeavor.
Merchants are looking for help with data, Pezold said, acknowledging that “multi-variant tokenization” is a “mouthful.” However, the importance of data will keep increasing, perhaps even surpassing the importance of oil to history and modern civilization (assuming that has not happened already). Oil, after all, would have remained a sticky, smelly fluid without proper distribution and other enablers. Similarly, data without efficient organization and solid security is nothing but numbers that have not reached their full economic potential.