Tokenization

Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data through a tokenization system.
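To make the mapping concrete, here is a minimal sketch in Python of a vault-based tokenization flow. The names (TokenVault, tokenize, detokenize) are illustrative, and the in-memory dictionaries stand in for the hardened, access-controlled storage a real tokenization system would use.

import secrets


class TokenVault:
    """Maps sensitive values to random, meaningless tokens and back."""

    def __init__(self):
        self._token_to_value = {}   # token -> sensitive value
        self._value_to_token = {}   # sensitive value -> token (for reuse)

    def tokenize(self, sensitive_value: str) -> str:
        # Reuse an existing token so the same input always maps to one token.
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # The token is generated at random, independently of the input,
        # so it carries no extrinsic or exploitable value on its own.
        token = secrets.token_hex(16)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault's mapping can recover the original value.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")  # sample card number
print(token)                                 # safe to store or transmit
print(vault.detokenize(token))               # recoverable only via the vault

Because the token is random rather than derived from the card number, stealing the token alone reveals nothing; an attacker would also need access to the tokenization system's mapping.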
Taking Corporate Travel Payments Mobile
October 21, 2015  |  Mastercard

Striking a balance between instituting corporate controls and providing a frictionless user experience is an ongoing challenge in the travel and expense management market. Richard...

The Tokenization And/Or Encryption Decision
September 16, 2015  |  Merchant Innovation

Protecting cardholder data at the point of sale is critical, and many merchants struggle with the choice between deploying tokenization and encryption. And throwing EMV into the...
