Mastercard On Repairing The Data Trust Deficit — With Decency 

Can data be harvested … with decency?

The California Consumer Privacy Act looms, taking effect at the beginning of 2020. Across the pond, the General Data Protection Regulation (GDPR) took effect in May of 2018.

The laws protecting consumer privacy are changing the ways businesses collect consumers’ data, as they scramble to avoid the impacts and fines tied to breaches or unauthorized activities. Yet the regulations are still evolving, and differ depending on where one looks.

Beyond the regulatory landscape, the principles governing how corporations gather and utilize consumer-specific information may be ripe for change.

To navigate the changing digital landscape, Mastercard said on Thursday (Oct. 24) that it has launched its Data Responsibility Imperative (DRI), which details a framework of “six data responsibilities.” The framework serves as an adjunct to regulatory compliance efforts, helping companies address the social impact of, and their accountability for, the ways they collect and use data.

Among those tenets: Companies should uphold best-in-class security and privacy practices; companies should explain, simply and clearly, how data is collected and used; and individuals should have the ability to control how their data is used (together, the principle of transparency and control).

The six responsibilities — which also include innovating to ensure that “individuals benefit from the use of their data through better experiences, products and services” — come in tandem with a Mastercard survey of more than 2,000 individuals globally, which found that fewer than one-third think companies do a good job of handling data.

JoAnn Stonier, chief data officer at Mastercard, said in an interview with Karen Webster, “We live in a world where we hear all the time that data is ‘the new oil,’ and that it is the raw material of innovation.”

The Decency Principle — And The Trust Deficit

Though information security and privacy are table stakes in the online world today, noted Stonier, data standards are lacking. It all boils down to one principle, she said: decency.

Stonier pointed out that the Mastercard survey shows a “trust deficit” when it comes to interactions between firms and individuals. If only 26 percent of individuals think companies are handling data effectively, while 90 percent of people think data privacy is important, it follows that many consumers feel their concerns are not being addressed.

The companies themselves show a bit of a disconnect in how they view their data-centric practices: Roughly 50 percent of corporate executives surveyed said their firms do a good job of handling data.

Stonier noted that, when approaching how data can and should be used, companies should start with the premise that consumers own their personal data.

“Nobody should have better rights to your name than you,” she said. “And you should have the right to control and understand what happens to it, … and who should benefit from the use of it.”

Those basic principles must be at the center of how products, solutions and services are designed and brought to market in an ecosystem that includes not just Mastercard, but its vendors and partners, as well as the issuers and merchants, said Stonier. Those firms are the ones that collect the data that passes through Mastercard’s network as it processes tens of thousands of transactions per second.

Adherence to those principles shows up in concrete actions that keep data use within ethical and regulatory bounds, such as using multiple layers of security, tokenization and encryption to protect data. In other examples (in use at Mastercard and other firms), Stonier added, consumers can visit portals to see what personal information companies hold about them.

The DRI will guide best practices, even among widely differing business models — which, of course, rely on different types of consumer information to conduct operations, noted Stonier. The data used in fraud analytics, for instance, may be different from what is collected when determining a credit score or taking out a mortgage. Furthermore, the DRI will help companies explain — in a consistent way — what they do with data, how they do it and, most importantly, why they do it, all in ways that make sense to consumers.

“It’s when the data use doesn’t make sense to you, or you didn’t consent to it or weren’t told about it, that there’s a breakdown in trust,” Stonier said.

The drive for data integrity and standardized best practices is especially urgent, she added, as artificial intelligence (AI) and machine learning are increasingly deployed to collect and manage data. It is important for companies to understand data quality and their algorithmic processes in order to minimize bias.

With the trust gap currently in place, said Stonier, organizations stand at a crossroads. Companies must be cognizant that a broad spectrum of consumers may be using their goods and services — ranging, for example, from the tech-savvy 12-year-old (who is “the consumer of the future”) to the 83-year-old who may not be aware that data is being used at all.

The conversation has shifted since earlier in the decade, Stonier said, when, during a tenure at The Aspen Institute, she found tech innovators to be relatively less concerned with data practices. Now, she explained, we’re at a different place in the internet economy, where individuals engage with commerce and companies in a more knowledgeable way.

“If we break the customer’s trust, we will break this economy that is based on data,” Stonier warned. “So, we need to change the conversation, and begin to share best practices.”