The company that offered the first tokenization-in-a-box solution pushes for industry standards for an emerging technology.
Security is becoming more complicated with increased oversight from industry and government, and with this complexity companies are seeing significant additional costs—tens or even hundreds of thousands of dollars in auditing alone, not to mention new security system solutions and the personnel to set up and maintain them.
While the IT industry is becoming sophisticated about encryption and key management, many smaller companies are wary of jumping into encryption because of the additional effort and risks involved. If you don't guard your keys, it's as if you never encrypted your data; if you lose your keys, you have essentially erased your data. IT administrators diving into this area have to wonder: am I being paid enough to assume this added responsibility?
Nevertheless, for companies that accept credit cards, the Payment Card Industry Data Security Standard (PCI DSS) requires that if you store credit card numbers, you must encrypt them. The rule is quite explicit. However, following such rules isn't always as easy as it sounds. Encrypted credit card or Social Security numbers take up more space in a database, and the encrypted values may no longer fit the original field size.
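The field-size problem is easy to see with a little arithmetic. The sketch below is illustrative only—the exact overhead depends on the cipher mode and encoding a given product chooses—but it estimates how long a typical AES-CBC ciphertext becomes once it is padded, prefixed with an initialization vector, and Base64-encoded for storage in a text column:

```python
# Sketch: why encrypting a 16-digit card number no longer fits a
# 16-character database column. Illustrative arithmetic only; the
# precise overhead varies with cipher mode and encoding choices.
import math

def encrypted_text_length(plaintext_bytes: int, block_size: int = 16) -> int:
    """Estimated length of an AES-CBC ciphertext stored as Base64 text:
    a random IV (one block) plus the plaintext padded to a full block."""
    padded = (plaintext_bytes // block_size + 1) * block_size  # PKCS#7 always adds padding
    raw = block_size + padded                                  # IV + ciphertext
    return math.ceil(raw / 3) * 4                              # Base64 expansion

pan = "4111111111111111"                 # a 16-digit test card number
print(len(pan))                          # 16 -> fits a CHAR(16) column
print(encrypted_text_length(len(pan)))   # 64 -> four times the original field
```

A 16-character field balloons to roughly 64 characters of stored text, which is exactly the kind of schema change that makes retrofitting encryption onto existing applications painful.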
Humans being what they are, people have sometimes tried to cut corners or have simply overlooked gaps in their security. Enter the auditors, whose job it is to certify that companies handling personal information are doing so responsibly and securely. Of course, auditors don't work for free; they don't even work cheap. Auditors cost a lot of money, honey, and the longer they hang around your shop, the more they're going to bill. If yours is a distributed organization, and you've got credit card information stashed all over the place, the auditor is going to want to check every system where that data is stored. Cha-ching! That means more money for every whistle stop along the tracks.
How the heck can we reduce the costs of these pricey audits? One answer is by using tokens. When you substitute tokens for, say, credit card numbers on individual systems, you reduce the number of places where you are actually storing sensitive data. The credit card numbers are stored in a single vault. The many servers out there in the hinterland store meaningless tokens. Servers storing only tokens are not storing sensitive data, because tokens have no value on their own. And if a server is not storing sensitive data, it is no longer within the scope of the auditor! Now your audit is going to take less time, and therefore it's going to cost less to complete. Suddenly, the value of tokenization in an encryption scenario takes on monetary significance, particularly for larger, distributed organizations.
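The substitution described above can be sketched in a few lines. The `TokenVault` class here is a hypothetical, in-memory stand-in for a real tokenization product—the point is only to show the shape of the idea: real card numbers live in exactly one place, while every other system holds a random, same-format token that is mathematically unrelated to the card number it replaces:

```python
# Minimal sketch of a token vault (hypothetical interface, not any
# vendor's API): real card numbers live only inside the vault, while
# outlying servers store random, format-preserving tokens.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_pan = {}  # the only place real card data lives

    def tokenize(self, pan: str) -> str:
        """Issue a random token with the same length and character class
        as the card number, retrying on the (rare) collision."""
        while True:
            token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
            if token not in self._token_to_pan:
                self._token_to_pan[token] = pan
                return token

    def detokenize(self, token: str) -> str:
        """Only the vault can map a token back to the real card number."""
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# Downstream servers store only `token`; an auditor examining them finds
# no card data, so they fall outside the audit's scope.
assert vault.detokenize(token) == "4111111111111111"
```

Because the token keeps the original length and digit format, it also sidesteps the field-size problem that encryption introduces.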
"The biggest benefit in tokenization is reducing exposure to that yearly audit," says John Pescatore, vice president and distinguished analyst at Gartner. "If you think of a scenario where a company stores the credit card number in four different servers, or four different applications, the first thing the Payment Card Industry standards say is, if you're going to store card data, you have to encrypt it. Anywhere you do store that card data is subject to the yearly audit and has to meet the 240-odd requirements of the DSS [Data Security Standard]."
Pescatore says that tokenization is not necessarily a complete solution to the challenges of encryption, but it has great appeal in certain situations. The tokens can match the original data size in the database, and tokenization can reduce the scope of audits, particularly important where audits are proving costly.
"If you tokenize, you're now storing the security data in one place, and [in] those other four places, there is just this token value; the servers storing the tokens are not within the scope of the audit," he says. This has resulted in savings of hundreds of thousands of dollars to early adopters, according to nuBridges, one of the first proponents of commercialized tokenization solutions.
nuBridges was not the first to use tokens in lieu of encrypted data, but, says Gary Palgon, vice president of product management at nuBridges, "We were the first ones a year ago to introduce the only commercial, off-the-shelf solution for an enterprise." Since then, several other companies have come up with boxed solutions, and the industry is emerging along different lines. Not only are vendors introducing tokenization solutions, but companies are attempting to build their own token solutions in-house. With an increasing number of artists, the canvas is starting to get a bit muddy in the minds of some.
Pescatore notes that encryption is governed by a set of standards that say how to do it and how to test it; tokenization does not yet have such standards, so there is nothing against which to compare an implementation to ensure it's done correctly. The lack of standards is becoming a source of questions from companies that might otherwise benefit from tokenization, and the absence of standards may actually be creating security issues where implementation is not done well, according to Palgon.
"In one example that we see all the time," says Palgon, "the way they generate tokens is that they add one [digit]. So the first one is 1, the second one is 2, and the next one is 3. That's OK except in the case…of the healthcare industry…they're actually creating credit cards one after the other, generating the numbers for your health services account…. If they're sequential, then you could probably figure out what the pattern is for the token, and therefore, if you get one match from a token to a credit card, you could figure out what all of the following credit cards are."
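The flaw Palgon describes is easy to demonstrate. In this sketch (hypothetical data, illustrative only), tokens assigned from a simple counter preserve the order in which card numbers were issued—so when the card numbers themselves are sequential, as in his health services example, a single known token/card pair lets an attacker infer all the neighboring mappings. Tokens drawn from a cryptographic random number generator carry no such ordering:

```python
# Sketch of the sequential-token weakness (hypothetical card numbers).
import secrets

cards = ["5000000000000001", "5000000000000002", "5000000000000003"]

# Weak scheme: the token is just a counter (1, 2, 3, ...).
sequential_tokens = {card: str(i + 1) for i, card in enumerate(cards)}
# An attacker who learns one pair, say token "2" -> card ...0002, can
# now infer that token "1" maps to ...0001 and token "3" to ...0003.

# Stronger scheme: tokens from a cryptographic RNG reveal no ordering,
# so one compromised mapping tells the attacker nothing about the rest.
random_tokens = {card: secrets.token_hex(8) for card in cards}
```

This is precisely the sort of implementation detail a tokenization standard would pin down: a token must be unpredictable, not merely unique.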
The solution to these types of issues, and to the myriad questions companies have about the utility and security of tokens, is to devise industry standards around tokenization. Palgon believes there is a need to create tokenization standards similar to the industry standards that exist for encryption, which most people now accept at face value. The problem is that there are no standards as yet and no formal organization working on them. Meanwhile, independent companies are devising their own unique ways of implementing tokens, some better than others. Even if all the solutions were secure, a variety of proprietary approaches makes the move to tokenization unappealing to larger companies that do not want to get locked in to a single vendor's way of doing things for fear they might want to switch vendors down the road.
To address the lack of standards in the tokenization arena, nuBridges is proposing a Tokenization Standards Organization and is actively contacting vendors in the field to encourage them to come together to develop a set of universally acceptable tokenization specifications. At RSA Conference 2010 in San Francisco last March, nuBridges announced it would support an industry-wide tokenization data security model. So far, response has been good, says Palgon, and the company is devoting significant effort to bringing together vendors that heretofore have perceived each other as competitors.
The company has also introduced a tokenization partnership program in which third parties—such as point-of-sale vendors and loss prevention managers—sign up to have nuBridges educate them and help them upgrade any applications they may be running. "We're doing education to third parties and giving them a sandbox where they can create tokens and test their application," says Palgon.
While tokenization is not the answer to all encryption issues—you can't tokenize a scanned document with credit card information on it—it clearly is an emerging trend in the field of encryption and one that undoubtedly will mature in the coming year. Not only are there technical benefits, but the financial payback from reducing the scope of expensive audits makes it a path that IT will surely be following as time and resources permit.
Those who would like more information about tokenization, or who would like to participate in the nuBridges Tokenization Standards Group, can call the company at 866.830.3600 or watch the YouTube video explaining what tokenization is.