Tokenization

Tokenization, when applied to data security, is the process of replacing a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data through a tokenization system.
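
As an illustration, here is a minimal sketch of that indirection in Python. The TokenVault class and its method names are hypothetical, and the in-memory dictionaries stand in for what would, in a real tokenization system, be a hardened, access-controlled data store:

```python
import secrets


class TokenVault:
    """Minimal illustrative tokenization vault (in-memory only)."""

    def __init__(self):
        self._token_to_value = {}  # token -> sensitive value
        self._value_to_token = {}  # reuse the same token for a repeated value

    def tokenize(self, value: str) -> str:
        """Return a token standing in for `value`, generating one if needed."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it has no exploitable relationship to the
        # original value; it can only be resolved back through the vault.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Map a token back to the sensitive value it references."""
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # sample card number
assert vault.detokenize(token) == "4111 1111 1111 1111"
```

Because the token carries no intrinsic information, systems that store or transmit only tokens reveal nothing about the underlying data unless the vault's mapping itself is compromised.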
