WHAT IS TOKENIZATION?

When applied to data security, tokenization is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data through a tokenization system.
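
To make that mapping concrete, here is a minimal sketch in Python. It is illustrative only: the TokenVault class and its method names are hypothetical, and a plain in-memory dictionary stands in for the hardened, access-controlled data store a real tokenization system would use.

    import secrets

    class TokenVault:
        """Minimal in-memory tokenization system (illustrative sketch)."""

        def __init__(self):
            self._token_to_value = {}  # token -> original sensitive value

        def tokenize(self, sensitive_value: str) -> str:
            # Generate a random token with no mathematical relationship
            # to the original value, so it carries no exploitable meaning.
            token = secrets.token_hex(8)
            self._token_to_value[token] = sensitive_value
            return token

        def detokenize(self, token: str) -> str:
            # Only the tokenization system can map a token back to the
            # sensitive data it replaces.
            return self._token_to_value[token]

    vault = TokenVault()
    token = vault.tokenize("4111-1111-1111-1111")  # e.g., a card number
    print(token)                    # random value, safe to store downstream
    print(vault.detokenize(token))  # original recovered only via the vault

Note the design choice this sketch reflects: because the token is generated randomly rather than derived from the sensitive value, an attacker who obtains only the token learns nothing about the original data; recovering it requires access to the vault itself.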