Tokenization

Tokenization is a process that replaces a high-value credential (e.g., a payment card primary account number (PAN) or a Social Security number) with a surrogate value that is used in transactions in place of that credential. The surrogate may be in a different format, or it may preserve the format of the original credential (e.g., a 16-digit value resembling a payment card PAN). In payments, the objective of tokenization is to remove account data from the payment environment and replace it with something that is useless outside of the environment in which the token was created. While tokenization is not a new concept, recent data breaches have increased awareness of the need to protect payment account credentials. Tokenization is one approach that can be used to safeguard payment credentials from being stolen and used for fraudulent transactions.

There are different kinds of tokens and different ways to create them. A token can be merchant-specific. It can be single-use or multi-use. It can be stored and managed in the cloud, in a token vault, or at a merchant location. A token is created using a process defined by the token solution provider. Once a token has been created, it may be tied to a card on file, an individual transaction, a payment card, or a device.

Two types of tokens are being used and/or defined in the payments industry:

1. Tokens that function in place of the actual PAN to perform a payment transaction
2. Tokens that replace the PAN, are stored by merchants and/or acquirers in place of actual PANs, and are used for other purposes (e.g., loyalty programs)
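The vault-based, multi-use token described above can be sketched in a few lines. This is an illustrative toy, not a PCI-compliant product: the class name, the in-memory dictionaries standing in for a secured vault, and the random-digit token format are all assumptions for demonstration.

```python
import secrets

# Toy sketch of a vault-based tokenization service. A real product would
# secure the vault and keys per PCI SSC requirements; this only illustrates
# the PAN -> surrogate mapping.
class TokenVault:
    def __init__(self):
        self._token_to_pan = {}  # the "vault": token -> original PAN
        self._pan_to_token = {}  # lets a multi-use scheme reuse one token per PAN

    def tokenize(self, pan: str) -> str:
        """Replace a PAN with a random surrogate of the same length and format."""
        if pan in self._pan_to_token:  # multi-use: same PAN -> same token
            return self._pan_to_token[pan]
        # A random digit string the same length as the PAN, so it still fits
        # in systems that expect a PAN-shaped value.
        token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

pan = "4111111111111111"
vault = TokenVault()
token = vault.tokenize(pan)
```

Because the token is drawn at random rather than derived from the PAN, stealing the token reveals nothing about the PAN, which is the "useless outside the environment" property described above.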

PCI Tokenization Model:

PCI SSC is currently developing security requirements for tokenization products (e.g., tokenization applications or appliances) that replace a PAN with a token. The tokenization processes described by PCI cover both reversible tokens, which can be exchanged back for the original PAN ("detokenization"), and "irreversible" tokens, for which no mechanism is supported to reproduce the PAN. The goal of this effort is to remove the need to store PANs, thereby reducing the risk of unauthorized disclosure, and it is focused on tokens used in the acquiring environment.
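The reversible/irreversible distinction can be made concrete with a short sketch. The vault lookup and the keyed one-way function below are illustrative assumptions, not a PCI-specified design; in particular, the HMAC key would live in a hardware security module, not in source code.

```python
import hashlib
import hmac
import secrets

# Reversible tokens: a vault maps token -> PAN, so "detokenization" is a lookup.
vault = {}

def tokenize_reversible(pan: str) -> str:
    token = secrets.token_hex(8)  # random surrogate with no relation to the PAN
    vault[token] = pan
    return token

def detokenize(token: str) -> str:
    return vault[token]

# Irreversible tokens: a keyed one-way function. There is no mechanism to
# recover the PAN; one can only re-derive the same token from the same PAN.
SECRET_KEY = b"demo-key"  # illustrative only; real keys are held in an HSM

def tokenize_irreversible(pan: str) -> str:
    return hmac.new(SECRET_KEY, pan.encode(), hashlib.sha256).hexdigest()

pan = "4111111111111111"
t = tokenize_reversible(pan)
```

The reversible form suits card-on-file use cases where the PAN must eventually be recovered for authorization; the irreversible form suits analytics or loyalty matching, where only a stable identifier is needed.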

It is anticipated that use of secure tokenization products will help to minimize the locations, systems, and networks where cardholder data is stored, processed, or transmitted. A secure tokenization implementation may help minimize the retention of payment card data in an entity's environment and hence simplify its PCI DSS compliance efforts. These tokenization security requirements are part of the Council's ongoing work to provide standards and guidance on technologies that can improve cardholder data security along the payment transaction chain.

The PCI effort will provide tokenization product vendors and developers with detailed technical requirements for how to generate and store tokens securely. A mechanism to evaluate tokenization products against the requirements is under consideration.

PCI security requirements are developed with the input of the PCI community of participating organization members, security assessors, testing laboratories and other key stakeholders. In addition, PCI SSC has held conceptual and technical discussions with a number of organizations that already offer tokenization products or services. PCI SSC also liaises with X9 and EMVCo on their respective tokenization efforts. 

Summary:

Commercial acquiring tokenization solutions are currently available and in use by merchants to remove cardholder data from their business environment (e.g., for loyalty programs or card-on-file transactions).

Tokenization standards are also now being developed and published by a number of industry organizations, with commercial solutions starting to use those specifications to provide tokenization services. Some standardization efforts are focused on data at rest, protecting data within a merchant's environment, while others are focused on data in transit, protecting data throughout the transaction process.

Tokenization standardization and broader implementation are evolving. The industry is starting to see alignment among the standardization efforts around the EMVCo tokenization specification. The EMVCo tokenization framework also references its use with EMV chip cards, combining the security benefits of EMV chip with tokenization. Acquirers will also continue to offer tokenization solutions to merchants that address specific merchant needs not otherwise addressed.

Thanks,

Karthick - Product Hunter.

