At the Teradata Partners Conference 2015 in Anaheim, Protegrity held multiple open forums to ask questions about data security in Teradata and how Protegrity helps the world's most successful organizations protect their databases, big data and analytics platforms.
"Working within the environment, how does PTY operate within the Teradata ecosystem?"
Protegrity integrates at a low level with all Teradata platforms and their corresponding data protectors. Once fully implemented, the data protection is largely transparent to authorized users.
"What’s the performance impact of a Protegrity data security implementation?"
Performance impact is a complex question because so many variables can affect it. When the software is used as designed and intended, the overall system performance impact is barely measurable.
This is true for several reasons. First, typically only 1% to at most 3% of the total data columns across all tables in a database are candidates for protection (encryption or tokenization). Second, typically 80% to 90% of access to the protected data requires only the data in its protected form. Therefore only 10% to 20% of the access to 1% to 3% of the data involves any additional processing overhead at all.
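The arithmetic behind that claim is easy to sketch. The mid-range figures below are illustrative, taken from the percentages quoted above:

```python
# Back-of-envelope estimate using the article's figures (illustrative only)
protected_fraction = 0.02     # ~1-3% of columns are candidates for protection
clear_access_fraction = 0.15  # ~10-20% of accesses need the clear value

# Fraction of ALL data accesses that incur protect/unprotect overhead
overhead_fraction = protected_fraction * clear_access_fraction
print(f"{overhead_fraction:.1%} of accesses touch the security layer")
```

At these mid-range values, only about 0.3% of all accesses ever invoke a protect or unprotect operation, which is why the system-wide impact is hard to measure.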
"Does Protegrity support data moving from a data lake to Teradata?"
Protegrity fully supports all normal migration, import/export, sharing, and copying of data between databases and other data processing platforms such as Hadoop. In most cases data sharing occurs without any performance or operational impact. If real 9-digit integer SSN values are used today, then randomly generated 9-digit integer token representations of the real SSN field will be used in their place, supporting the vast majority of users and business processes. Protegrity also fully supports Teradata QueryGrid technology.
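The key property that keeps these data movements working is that the same clear value always maps to the same token on every platform, so joins and copies succeed on the protected values alone. The sketch below is a toy stand-in (an HMAC truncated to nine digits, with a made-up secret), not Protegrity's actual vaultless tokenization algorithm; it only illustrates the referential-integrity property:

```python
import hmac
import hashlib

SECRET = b"demo-secret"  # hypothetical; real tokens come from managed codebooks

def toy_token(ssn: str) -> str:
    """Deterministic 9-digit stand-in token for a 9-digit SSN."""
    digest = hmac.new(SECRET, ssn.encode(), hashlib.sha256).digest()
    return str(int.from_bytes(digest[:8], "big") % 10**9).zfill(9)

# The same SSN tokenizes identically in the data lake and in Teradata,
# so a join on the token column works without ever exposing the clear SSN.
lake_row = {"ssn_token": toy_token("123456789"), "source": "hadoop"}
warehouse_row = {"ssn_token": toy_token("123456789"), "source": "teradata"}
assert lake_row["ssn_token"] == warehouse_row["ssn_token"]
```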
"Can you give us an example of detokenizing at the UI level?"
Protegrity API calls function much like User Defined Function (UDF) calls in Teradata or Hadoop. They first check whether the calling user is in policy (authorized to protect or unprotect the data), then perform the function and log the activity.
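That check-then-act-then-log flow can be sketched as follows. The names here (`POLICY`, `AUDIT_LOG`, `unprotect`) are hypothetical illustrations, not Protegrity's actual API:

```python
# Hypothetical sketch of a policy-checked unprotect (detokenize) call
POLICY = {"analyst_jane": {"unprotect"}}  # users -> operations they may perform
AUDIT_LOG = []                            # every attempt is logged, allowed or not

def unprotect(user: str, token: str, detokenize) -> str:
    allowed = "unprotect" in POLICY.get(user, set())
    AUDIT_LOG.append({"user": user, "op": "unprotect", "allowed": allowed})
    if not allowed:
        raise PermissionError(f"{user} is not in policy for unprotect")
    return detokenize(token)

codebook = {"481273906": "123456789"}  # toy token -> clear-value lookup
print(unprotect("analyst_jane", "481273906", codebook.get))
```

A user who is not in policy gets a `PermissionError`, and the denied attempt is still written to the audit log, mirroring the monitoring behavior described above.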
"What are the different protection methods available?"
Encryption, Vaultless Tokenization, masking (static and dynamic), Format-Preserving Encryption or DTP (Data-Type Preservation), and activity monitoring/logging only (without applying any actual data security).
"Where are the keys stored?"
The Data Encryption Keys (DEKs) and token codebooks are stored in a secure repository that is protected by a key chain. A Repository Key is used to encrypt the small database containing all of the DEKs. The Repository Key is in turn encrypted with a Master Key. If desired or required, the Master Key can be protected by an HSM at the top of the key chain.
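The key chain is a standard envelope-encryption pattern. The sketch below uses a toy one-time-pad XOR in place of real AES key wrapping (do not use XOR in practice); it only illustrates how unwrapping walks the chain from the Master Key down to a DEK:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad 'encryption' standing in for real AES key wrapping."""
    return bytes(a ^ b for a, b in zip(data, key))

# Key chain: Master Key -> Repository Key -> Data Encryption Keys (DEKs)
master_key = secrets.token_bytes(32)      # optionally held in an HSM
repository_key = secrets.token_bytes(32)
dek = secrets.token_bytes(32)

wrapped_repository_key = xor_bytes(repository_key, master_key)
wrapped_dek = xor_bytes(dek, repository_key)

# Unwrapping walks the chain top-down: Master Key releases the Repository
# Key, which in turn releases the DEK that protects the actual data.
recovered_repository_key = xor_bytes(wrapped_repository_key, master_key)
recovered_dek = xor_bytes(wrapped_dek, recovered_repository_key)
assert recovered_dek == dek
```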
"What are the benefits of Vaultless Tokenization vs. encryption?"
Protegrity Vaultless Tokenization is significantly more flexible and easier to operate and manage than traditional vault-based tokenization for many reasons. First, the original protected values never need to exist anywhere in clear text, not even in a two-column lookup table. Second, Vaultless Tokenization uses static (fixed) token lookup "codebooks" or tables that are managed much like encryption keys. They are small, easy to distribute, fast to access, and do not change or need to be updated when they are used to protect a new value.
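The static-codebook property can be illustrated with a deliberately simplified sketch: fixed per-position digit substitution tables generated once from a seed. Protegrity's actual scheme chains many more lookups and is far stronger; this only shows why protecting a brand-new value requires no table update:

```python
import random

# Generate static per-position digit codebooks once; afterwards they are
# distributed and managed like keys. Seed and sizes are illustrative.
rng = random.Random(2015)
CODEBOOKS = []
for _ in range(9):
    perm = list("0123456789")
    rng.shuffle(perm)
    CODEBOOKS.append(perm)
INVERSE = [{v: str(i) for i, v in enumerate(book)} for book in CODEBOOKS]

def tokenize(ssn: str) -> str:
    """Map each digit through its position's fixed codebook."""
    return "".join(CODEBOOKS[i][int(d)] for i, d in enumerate(ssn))

def detokenize(token: str) -> str:
    """Invert the per-position lookups to recover the original value."""
    return "".join(INVERSE[i][c] for i, c in enumerate(token))

# Any 9-digit value, including one never seen before, tokenizes immediately;
# the codebooks never grow, unlike a vault that stores every mapping.
assert detokenize(tokenize("123456789")) == "123456789"
```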
"Our internal company policy approves encryption, how do we get the policy to treat tokenization the same as encryption?"
The usual approach is an in-depth technical discussion with the person or group requiring encryption about how Protegrity tokenization works, demonstrating that it can actually be harder to brute-force than even AES-256 encryption. Once decision makers fully understand the technology, it becomes obvious why credit card issuers, government agencies and global banks are all utilizing it alongside encryption.
Tokenization is recognized by standards bodies such as PCI and is quickly gaining approval from others such as NIST. Protegrity of course offers the option to choose whatever data protection method is most appropriate for a given situation, including encryption, tokenization, masking, FPE/DTP encryption and data obfuscation.
"What risks does adding Protegrity data security mitigate?"
Protegrity eliminates a wide range of risks and threat/attack vectors. An organization’s sensitive data exposure surface is dramatically reduced by protecting the data at-rest, in-transit and in-use with a single, centralized, enterprise-wide solution that works consistently across all platforms. This means fewer keys to manage, fewer opportunities for keys to be compromised, and significantly fewer instances where sensitive data is ever exposed.
The vast majority of access to the data will be in its protected form. This supports most business processes, which need only referential integrity, not the actual clear-text values.