Consider Tokenization to Secure Sensitive Data


With the rising number of security breaches and hacks, it is better to avoid losses by identifying and protecting sensitive data before it is exposed.

Tokenization is defined as substituting a sensitive data element with a non-sensitive equivalent (a token) that has no extrinsic or exploitable meaning or value. The token must bear no resemblance to the data, and the security of the token relies on the infeasibility of determining the original data from the resulting token. Tokenization may use cryptographic methods to create the token, but the resulting token is not ciphertext; it retains the same format and length as the original data.
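To make the idea concrete, here is a minimal sketch of one way a vault-based tokenizer could work: each digit or letter is swapped for a random one of the same class, and the mapping is stored in a lookup table, so the token keeps the original length and format while revealing nothing about the data. The in-memory _vault dictionary is a placeholder assumption; a real product would use an isolated, access-controlled token store.

```python
import secrets
import string

# Hypothetical in-memory "vault" that maps tokens back to original values.
# A real tokenization product would keep this in an isolated, access-controlled store.
_vault: dict[str, str] = {}

def tokenize(value: str) -> str:
    """Replace each digit or letter with a random one of the same class,
    so the token keeps the original length and format but bears no
    resemblance to the data it replaces."""
    while True:
        token = "".join(
            secrets.choice(string.digits) if ch.isdigit()
            else secrets.choice(string.ascii_letters) if ch.isalpha()
            else ch                                     # keep separators such as '-' in place
            for ch in value
        )
        if token != value and token not in _vault:      # avoid collisions and accidental matches
            _vault[token] = value
            return token

def detokenize(token: str) -> str:
    """Redeem a token for the original value; only the vault holder can do this."""
    return _vault[token]

# Example: a card number tokenizes to something like '8203-5571-0946-2284'
print(tokenize("4111-1111-1111-1111"))
```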

While most Tokenization projects focus on payment systems, specifically credit card payments between customers, merchants, and banks, there are uses for Tokenization solutions beyond satisfying Payment Card Industry (PCI) standards. Companies can benefit from Tokenization products by tokenizing Personally Identifiable Information (PII) and any other sensitive information, protecting their customer data from exposure.

Why use tokens?

Using tokens minimizes the exposure of sensitive data to accidental or unauthorized access: tokens, rather than the sensitive data itself, are stored in files and databases.

Companies that are hacked and have their data stolen can be assured that the tokenized data is worthless to the attacker.

Existing software applications can operate on tokens more easily than they can be modified to expand data fields and handle the larger fields that encrypted data requires. Because tokenization produces a token with the same character length and format as the input data, existing applications rarely need to change, saving time and money.

For employees who need access to sensitive data such as a social security number (SSN) for billing or customer identity verification, tokenization products can either detokenize the value and reveal it in full, or mask most of the original data and reveal only the last four characters, for example.
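As a simplified illustration of that partial reveal, a masking helper like the one below could be applied once a token has been redeemed (building on the vault sketch above); the function name and format are assumptions, not features of any particular product.

```python
def mask_value(value: str, keep_last: int = 4) -> str:
    """Mask every alphanumeric character except the last few, keeping
    separators in place, e.g. '123-45-6789' -> '***-**-6789'."""
    visible_from = len(value) - keep_last
    return "".join(
        ch if i >= visible_from or not ch.isalnum() else "*"
        for i, ch in enumerate(value)
    )

# A tokenization product would typically redeem the token first, then mask:
#   mask_value(detokenize(token))
print(mask_value("123-45-6789"))   # -> ***-**-6789
```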

Protecting Tokenization Systems

A critical component of protecting sensitive data is ensuring attackers cannot detokenize the tokens to access the original data, and that means protecting the tokenization system itself. The risk-reduction benefits of tokenization require that the tokenization system be logically isolated and segmented from the data processing systems and applications that previously handled the sensitive data the tokens replace. Only the tokenization system can tokenize data to create tokens or detokenize tokens to redeem the original data. Tokenization systems may be operated in-house, within a securely isolated segment of the data center, or outsourced as a service from a secure token service provider.
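The hypothetical sketch below shows what that separation might look like in code: applications interact with the tokenization system only through narrow tokenize and detokenize calls, and detokenization is gated by an authorization check. The class, caller names, and token format are illustrative assumptions, and format preservation is omitted here for brevity.

```python
import secrets

class TokenizationService:
    """Hypothetical sketch of a logically isolated tokenization service.
    Applications interact only through tokenize() and detokenize(); the
    vault is never exposed, and detokenization requires authorization."""

    def __init__(self, authorized_callers: set[str]):
        self._vault: dict[str, str] = {}        # reachable only inside the service
        self._authorized = set(authorized_callers)

    def tokenize(self, value: str) -> str:
        token = secrets.token_urlsafe(16)       # token bears no relation to the data
        self._vault[token] = value
        return token

    def detokenize(self, caller: str, token: str) -> str:
        if caller not in self._authorized:
            raise PermissionError(f"{caller} is not authorized to redeem tokens")
        return self._vault[token]

# Only the billing system, for example, may redeem tokens:
service = TokenizationService(authorized_callers={"billing"})
t = service.tokenize("123-45-6789")
print(service.detokenize("billing", t))         # -> 123-45-6789
```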

The security of the entire system, including sensitive data capture and authorization, the tokenization methodology, and the storage, use, and subsequent access of tokens, depends on the customer's own tokenization implementation.

Summary

Companies should consider using Tokenization solutions to protect their sensitive data. With the rising number of security breaches and hacks, it is better to avoid losses by identifying and protecting sensitive data from exposure.

Those considering Tokenization solutions should ensure that these systems are Common Criteria certified and NIST FIPS 140 validated, so that the systems being evaluated have actually been cryptographically tested and assessed.

https://www.commoncriteriaportal.org/products/

http://csrc.nist.gov/groups/STM/cmvp/validation.html

Tokenization is simpler to use with existing software applications that process sensitive data, saving the time and money that would otherwise go into altering applications, files, and databases to handle ciphertext. When combined with a secure implementation of an accredited solution, tokens can protect a company's, and its customers', data from exposure and theft.
