In an older but still relevant Gartner report, Using Tokenization to Reduce PCI Compliance Requirements, it was found that large merchants with an average of 100,000 customer accounts potentially store cardholder data in 10 to 20 different in-house locations. Since the PCI standard mandates that every system in the Cardholder Data Environment (CDE) must be audited, this common scenario both multiplies potential vulnerabilities and expands the audit scope: every one of those locations has to be assessed, which requires more resources and time and therefore results in higher costs.
So, is there a way to entirely eliminate the existence of cardholder data from the merchant environment in order to reduce the audit scope?
The answer to this question is yes, and the solution is Tokenization.
Tokenization simply replaces cardholder data with an “alias”: a separate, randomly generated value called a token. The sensitive data itself resides in a token vault, a single secure database that maps each token to its original value. Only the token values are stored locally when using applications or services. The process can be reversed when a token needs to be translated back into the original value, i.e., de-tokenization replaces a token with its associated clear-text value.
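The tokenize/de-tokenize round trip can be sketched in a few lines. This is a minimal illustration, not a production design: the class and method names are my own, and a real vault would be a hardened, access-controlled database rather than an in-memory dictionary.

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens to clear-text values."""

    def __init__(self):
        self._vault = {}  # token -> original clear-text value

    def tokenize(self, pan: str) -> str:
        # Generate a random token with no mathematical relationship
        # to the original PAN; only this token is stored by callers.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Reverse lookup: replace the token with its clear-text value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
original = vault.detokenize(token)
```

Applications downstream of the vault handle only `token`; the PAN never leaves the vault except through an explicit, auditable de-tokenization call.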
Tokenization can be applied in several ways. It can be performed by in-house applications applied to databases and other sensitive data stores, where the tokens themselves can be used for service transactions. Alternatively, tokenization can be consumed as a service: cloud tokenization platforms transform data into tokens and store the tokens, as well as the token mapping, within the service vendor’s own database.
With a tokenization solution outsourced via a SaaS model, cardholder data (CHD) never resides in the organization’s environment. The basic idea of encryption remains true: protect critical data with strong encryption algorithms wherever cardholder data is stored. However, tokenization moves that principle to a different level: protect cardholder data by entirely removing it from systems. Quite simply, organizations do not need to encrypt what they do not store. Someone else takes care of it.
Tokenization can offer organizations several benefits, including a chance to reduce the complexity of managing infrastructure and, in particular, encryption keys. It is very important to remember that encryption without an appropriate key management process is worthless; it is like setting a very strong password for your laptop and writing it on a post-it stuck next to the keyboard. (Requirement 3 of the PCI DSS standard deals largely with encryption keys and the key management process.)
There are some fundamental differences between tokenization and encryption that influence security. With tokenization, the original data is completely separate from the generated tokens, while encryption generally maintains a mathematical relationship to the original clear-text data. The security of encrypted data is also tied to the encryption algorithm and the keys used to encrypt it, which usually dictates a fixed output length and structure. Tokens, on the other hand, can be generated in many different ways, so the output type and length can be chosen freely and any relationship to the length of the original value can be removed. Furthermore, with encryption one has to take the key management process into account, which requires work and money in order to comply with Requirement 3 of the PCI DSS standard.
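That format freedom is easy to demonstrate. The sketch below, assuming a common design choice rather than any standard algorithm, builds a token that keeps the PAN’s length and last four digits (useful for receipts) while the remaining digits are pure randomness carrying no information about the original; the helper name is illustrative.

```python
import secrets

def format_preserving_token(pan: str) -> str:
    """Illustrative token: same length and last four digits as the PAN,
    but the leading digits are random and unrelated to the original."""
    random_part = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    return random_part + pan[-4:]

pan = "4111111111111111"
token = format_preserving_token(pan)
# token fits anywhere a 16-digit PAN fits, yet its leading digits
# cannot be reversed into the PAN without the token mapping.
```

Ciphertext, by contrast, takes whatever length and alphabet the cipher and mode produce, which is why legacy systems expecting a 16-digit field often cope better with tokens than with encrypted values.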
Transferring cardholder data off premises eliminates those expenditures. The less data kept on site, the lower the cost of keeping it secure. It also reduces the complexity of a company’s PCI DSS audit, because the organization no longer stores cardholder data.
The bottom line is that “if you don’t have a compelling business need to store CHD, don’t store it.” I know that storing the PAN might seem to make life easier, perhaps because you process many refunds or need to keep cards on file for recurring customers. But the drawback is that electronically storing the PAN brings a long list of requirements with it.
So, if you need to store the PAN, consider an alternative method.
Tokenization IS A GOOD solution.