Tokenization

Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data through a tokenization system.
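The process described above can be sketched with a minimal in-memory token vault. This is an illustrative assumption, not a production design (real tokenization systems use hardened, access-controlled vaults or vaultless cryptographic schemes); the `TokenVault` class and its method names are hypothetical:

```python
import secrets

class TokenVault:
    """Minimal in-memory tokenization vault (illustrative sketch only)."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, sensitive_value: str) -> str:
        # Reuse the existing token so equal inputs map consistently.
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # The token is random, so it has no extrinsic or exploitable
        # meaning: it cannot be reversed without the vault's mapping.
        token = secrets.token_hex(8)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the tokenization system can map the token back
        # to the original sensitive data element.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # a sample card number
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

A system that stores only tokens can process or transmit them freely; a breach of that system exposes no sensitive data, because the mapping back to the original values lives solely in the vault.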
