With PCI DSS 2.0 compliance newly mandated and recent guidance on PCI DSS tokenization[i], this is an excellent time for merchants to review their compliance and PCI scope-reduction strategies. One of the more common approaches to reducing PCI DSS Scope (and hence the cost of assessments and the associated costs of remediation) is to tokenize PAN data within the enterprise.
While this blog series focuses on tokenization within a retail environment, the approaches and results are equally applicable to any tier 1 or 2 merchant with a large investment in existing data centers.
Why Reduce PCI Scope?
Most, if not all, tier 1 and tier 2 merchants are already PCI compliant and view continuing compliance as a cost of doing business. Why would the merchant decide to change its IT infrastructure to reduce PCI scope?
To paraphrase the PCI DSS 2.0 Standard[ii], PCI DSS Scope may be defined as the set of all systems that store, transmit, or otherwise have access to Primary Account Number (PAN) data. That is, any system that accesses credit card data in any way (encrypted or not) is potentially within PCI Scope.
Since PCI Scope is the set of systems that must be evaluated by a Qualified Security Assessor (QSA) for compliance, the cost of an assessment is directly related to the size of the task (and therefore to the size of the PCI Scope).
The average assessment cost for Tier 1 merchants is $225,000 (with 10% exceeding $500,000 annually)[iii]. This figure includes only direct out-of-pocket fees to QSA organizations and does not include the time and resources the merchant must spend to bring systems in line with the standard.
The largest and least predictable cost is in remediation of PCI inadequacies. As a result of the yearly assessment, the merchant often has a list of remediation activities and compensating controls that must be implemented in order to maintain compliance. Often these involve disrupting or upgrading existing systems or changing where in the network or on what physical servers systems may reside. The cost of this remediation often dwarfs the cost of the annual assessment and may be revisited every year.
Since much of the standard is subjective and compliance is at the discretion of the QSA, a determination of point-in-time compliance one year is no guarantee of the same outcome the following year (even with minimal or no changes to IT infrastructure).
So not only does a large PCI Scope mean a large assessment cost and potentially larger remediation costs, it also adds the risk of unplanned expenditures caused by IT disruption, even for IT systems that change little between assessments.
In short, it is important to reduce scope as much as possible in order to reduce both ongoing costs and the risk of large, unplanned IT expenditures. One obvious strategy here is to reduce the risk as much as possible and ‘delete’ the data. Deleting the data may involve moving the problem to someone else, changing existing business processes to remove PANs, or relying on tokenization to shrink the PCI footprint.
Common Strategies for Reducing PCI DSS Scope
Common strategies for reducing PCI DSS Scope include the following:
- Outsourcing all credit card processing and credit card handling to another vendor.
- Eliminating all stored PAN data from the network.
- PCI DSS Tokenization
The first option is by far the best at reducing scope: if handled properly, little or no PCI DSS Scope remains for the merchant. This approach is not always practical, however, for a variety of reasons: existing IT systems may not accommodate the change, and it is often preferable for the customer to enter into new transactions without re-entering credit card data. Moreover, large merchants often cannot change existing business processes that rely on PAN data, as doing so may involve re-training or re-deploying existing personnel or even changing the way business is conducted, if PANs are used for analytics or in other business functions such as CRM.
The second strategy often cannot be applied uniformly, both because of the saved-credit-card problem described above and because data warehousing applications often need a unique identifier to track purchases from an existing customer (perhaps to identify profitable and unprofitable transactions and customers). Since not every customer is a member of a loyalty program, PAN data is often used to track customer activity.
The rest of this posting will concentrate on PCI DSS Tokenization.
What is PCI DSS Tokenization?
PCI DSS Tokenization is a means of protecting credit card data by substituting a different, non-encrypted value (a token) for a credit card number. Usually the token takes the form of a random number, with some of the leading and trailing digits of the PAN preserved, that appears to back-end systems to be a valid credit card number.
It is important that the random elements of the token (that is, the digits that are not preserved from the original PAN) are not in any way derived from the actual credit card number.[iv] The token-to-PAN mapping is stored in a secure vault, which alone can map a token back to its PAN.
If this is accomplished properly, the following results occur:
1) Any breach of documents containing tokens rather than actual credit card data is useless to an attacker, because the attacker does not have access to the token vault that stores the mappings.
2) Because tokens are random rather than encrypted, there is no decryption key to derive and therefore no offline attack vector for compromising tokens.
3) Systems that only touch tokens and not actual PAN data may be removed from PCI Scope and are therefore not susceptible to the direct costs or remediation exercises for PCI compliance.
4) Systems that are thus removed from scope may now have past remediations and compensating controls removed in order to free up MIPS for business processes or in order to delay or eliminate the need for costly hardware and software upgrades.
5) In contrast to encryption, tokenization does not incur a large key-management burden at each system that encrypts and decrypts data; key management is centralized to the operation and maintenance of the vault alone.
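To make the mechanics above concrete, here is a minimal sketch of format-preserving tokenization in Python. This is an illustration, not a production tokenizer: the function names, the in-memory dictionary standing in for a hardened vault, and the choice to preserve the first six and last four digits are all assumptions made for this example. Real tokenization products use hardened, access-controlled vaults and audited random-number generation.

```python
import secrets

def luhn_checksum(number: str) -> int:
    """Luhn checksum of a digit string; 0 means the number validates."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10

def tokenize(pan: str, vault: dict) -> str:
    """Keep the first six and last four digits, replace the middle with
    digits from a CSPRNG (never derived from the PAN), and adjust one
    digit so the token still passes a Luhn check in back-end systems."""
    head, tail = pan[:6], pan[-4:]
    middle_len = len(pan) - len(head) - len(tail)
    while True:
        middle = "".join(secrets.choice("0123456789")
                         for _ in range(middle_len - 1))
        for d in "0123456789":  # exactly one digit value fixes the checksum
            token = head + middle + d + tail
            if luhn_checksum(token) == 0:
                break
        # retry on the (unlikely) collision with the real PAN or a prior token
        if token != pan and token not in vault:
            break
    vault[token] = pan          # the mapping lives only in the vault
    return token

def detokenize(token: str, vault: dict) -> str:
    """Reverse lookup; only systems with vault access can recover the PAN."""
    return vault[token]
```

Because the token is random, Luhn-valid, and format-preserving, downstream systems that merely store or validate card-number formats can process it unchanged, which is precisely what allows them to be removed from PCI Scope.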
There are two primary reasons most merchants evaluate PCI DSS tokenization options.
1) To reduce the cost of PCI DSS compliance (as cost is directly related to scope).
2) To increase security and to drastically reduce the risk of a data breach.
Given these constraints, my colleagues and I contend that the best option is often to begin at the data center, where the most value can be gained with the least effort, and then use that effort to inform the decision of how best to secure other parts of the enterprise.
In my next blog post, I’ll discuss three common architectural approaches towards data center tokenization.
While we continue to explore tokenization, I encourage everyone to download a complimentary copy of PCI DSS expert and QSA Walter Conway’s PCI DSS Tokenization Buyer’s Guide, available here.
You are also welcome to peruse Intel’s solution for reducing PCI DSS scope by visiting the Intel Tokenization Broker landing page.
Tom Burns serves in Intel’s Data Center Software group where he works with many of the world’s top retailers to help increase security and reduce PCI DSS Scope. Tom joined Intel in 2008 and holds a BSEE from Purdue University.
[i] Information Supplement: PCI DSS Tokenization Guidelines, PCI Council, August 2011.
[ii] PCI DSS Requirements and Security Assessment Procedures, Version 2.0, Page 10, Section “Scope of Assessment for Compliance with PCI DSS Requirements.”
[iii] PCI DSS Trends 2010: QSA Insights Report, Ponemon Institute, Page 9, http://www.ponemon.org/local/upload/fckjail/generalcontent/18/file/PCI%20DSS%20Trends%20-%20QSA%20Insights%20010310.pdf
[iv] Information Supplement: PCI DSS Tokenization Guidelines, PCI Council, August 2011, Section 4.1.