PCI compliance is vitally important for businesses that process credit cards. The standard put forth by the PCI Security Standards Council – the PCI Data Security Standard (PCI DSS) – offers a framework for ensuring data security, and it must be followed in order to avoid fines and other penalties. Worse still, failure to comply can increase a company’s risk of suffering a data breach, which can result in substantial legal liabilities and lost sales, not to mention irrevocable damage to a company’s reputation.
Particularly important for businesses is the “scope” of these standards. That scope applies to merchants and other stakeholders, including issuers and payment processors. Determining scope requires identifying the people, processes and technologies that interact with cardholder data or could affect its security.
That’s why using tokenization to eliminate the need to store sensitive data in the first place can reduce scope and simplify compliance, said John Noltensmeyer, TokenEx’s Head of Global Privacy and Compliance Solutions.
Noltensmeyer spoke with PYMNTS about a recent TokenEx eBook focusing on PCI controls and General Data Protection Regulation (GDPR) data privacy concerns. Understanding how to mitigate the risks associated with those types of data is essential, especially in an environment where breaches are so costly.
“In order to address the costs – both tangible and intangible – in becoming PCI compliant, the goal of any organization should be to reduce their scope as much as possible,” the executive told PYMNTS. “You want to have a very firm grasp on where that data is in your environment and, to the extent possible, keep it out of your environment. That is where things like tokenization come into play.”
In tokenization, sensitive credit card data – such as the card number – is replaced with a non-sensitive stand-in, or token. Tokenization works across a variety of on-premises and outsourced solutions, but cloud tokenization offers firms a unique security strategy because it removes sensitive data entirely from the cardholder data environment (CDE), Noltensmeyer said.
In the outsourced model, the card data is submitted to a cloud tokenization provider, which returns tokens to customers. The only thing that enters the environment is the token itself. With cloud tokenization, businesses can reduce scope and its associated risk from their environments. That’s especially important when companies operate across multiple channels, such as brick and mortar, websites or call centers, Noltensmeyer said.
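The flow Noltensmeyer describes can be sketched in a few lines of Python. Everything here – the `TokenVault` class and its methods – is illustrative, not the TokenEx API; the point is that the provider holds the card number, and only a random token ever enters the merchant’s environment.

```python
import secrets

class TokenVault:
    """Stand-in for a cloud tokenization provider (illustrative only)."""

    def __init__(self):
        self._vault = {}  # token -> card number, held outside the merchant's CDE

    def tokenize(self, pan: str) -> str:
        token = secrets.token_hex(8)  # random surrogate with no relation to the PAN
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]  # only the provider can reverse the mapping


# Merchant side: submit the card number once, keep only the token.
provider = TokenVault()
token = provider.tokenize("4111111111111111")

assert token != "4111111111111111"                      # no card data in the CDE
assert provider.detokenize(token) == "4111111111111111"  # reversible only at the vault
```

Because the merchant stores only the token, its systems hold nothing an attacker could monetize, which is what takes them out of PCI DSS scope.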
As with payment card data, systems that capture personally identifiable information (PII) at contact points such as mobile devices, call centers or the point of sale can redirect sensitive data to be tokenized, vaulted and processed by the third-party organization without ever being stored in the customer’s internal systems.
Tokens are also fully customizable, with the ability to match the types of data being secured – spanning not just payment card data, but any type of information, such as Social Security numbers or health-related data, Noltensmeyer said.
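One common way tokens are made to “match” the data they replace is format preservation – same length and punctuation as the original, often with the last four digits left intact for display or reconciliation. The helper below is a hypothetical sketch of that idea, not a TokenEx function:

```python
import random

def format_preserving_token(value: str, keep_last: int = 4) -> str:
    """Randomize the digits of `value`, preserving length, punctuation,
    and the final `keep_last` characters (illustrative sketch only)."""
    head = value[:-keep_last]
    randomized = "".join(
        random.choice("0123456789") if ch.isdigit() else ch  # keep dashes, spaces, etc.
        for ch in head
    )
    return randomized + value[-keep_last:]

card_token = format_preserving_token("4111111111111111")
ssn_token = format_preserving_token("078-05-1120")

assert len(card_token) == 16 and card_token.endswith("1111")
assert len(ssn_token) == 11 and ssn_token[3] == "-" and ssn_token[6] == "-"
```

Because the token keeps the original’s shape, it can flow through validation rules, databases and reports built for the real data without any code changes.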
Gearing up for GDPR
Tokenization’s flexibility to protect disparate data types also allows companies to meet the requirements of the GDPR, Noltensmeyer told PYMNTS.
By using tokenization to comply with both PCI DSS and the GDPR, there is no need for two separate approaches, mindsets or technology deployments to satisfy data protection rules.
As has been well-documented, many companies were not prepared for the May 25, 2018 enforcement deadline of the data privacy regulation. We are just starting to see the effects – financial and otherwise – on companies that have been breached in the past few months or have failed to meet their obligations under the mandate.
GDPR opens the door to tokenization because it allows for the pseudonymization of data – i.e., processing information so it can no longer be attributed to a specific data subject without additional information. That’s crucial in helping firms prepare for other mandates, such as the California Consumer Privacy Act.
TokenEx pseudonymizes personal data by replacing it with tokens and storing those tokens in vaults, effectively making tokenization and pseudonymization synonymous, Noltensmeyer said.
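In code terms, vault-based pseudonymization might look like the sketch below (the names are illustrative): the business record keeps only the token, and re-identification requires the separately held vault – the “additional information” the regulation refers to.

```python
import secrets

# token -> original value, stored apart from the business's own systems
vault = {}

def pseudonymize(value: str) -> str:
    """Replace a personal data value with a random token, keeping the
    original only in the separately held vault (illustrative sketch)."""
    token = secrets.token_hex(8)
    vault[token] = value
    return token

customer_record = {
    "email": pseudonymize("jane@example.com"),
    "order_total": "49.99",  # non-identifying fields stay in the clear
}

# The business system holds no direct identifier ...
assert "jane@example.com" not in customer_record.values()
# ... and re-identification requires the separately held vault.
assert vault[customer_record["email"]] == "jane@example.com"
```

The same mechanism protects a card number or a Social Security number; only the data being vaulted changes, which is why one deployment can serve both PCI DSS and GDPR.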
“The nice thing about PCI is that there is a very discrete dataset,” he said. “There’s only so much information on that credit card when it comes to personal information. The definition of what constitutes personal data under GDPR is very broad and includes everything from names to identification numbers, all the way down to the IP addresses. … But if you use tokenization the right way, you can protect these data sets.”