Jun 15, 2021
  • Organizations must safeguard personally identifiable information and ensure the ethical use of data, which can be achieved through powerful data-protection techniques such as homomorphic encryption, privacy-preserving techniques like differential privacy, and Trusted Execution Environments (TEEs).
  • TEEs provide a secure area of a processor to protect sensitive data and allow businesses to extract value, apply insights in real time, and predict outcomes that can accelerate business growth while keeping machine learning models and data secure within an isolated processing environment.

Innovation is at the center of today’s highly competitive digital economy. Yet unlocking the full value of data while keeping it secure can prove daunting. Over the last few years, consumers have become more vocal about the responsible use of their data, organizations are looking to share and pool data securely, and regulations are piling up. As a result, safeguarding personally identifiable information (PII) and ensuring that it is used in ethical ways is critical.

The good news is that powerful data-protection techniques are available to support advanced analytics, machine learning (ML), and various forms of artificial intelligence (AI). These technologies include homomorphic encryption, which allows computations to be performed on encrypted data without revealing its content, as well as more specialized privacy-preserving techniques for AI such as data tokenization, differential privacy, and k-anonymity. They also include secure processing and new data-protection models for secure machine learning.
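Differential privacy is the easiest of these to sketch: before a statistic is released, noise calibrated to the query's sensitivity is added, so no individual record can be inferred from the output. A minimal illustration in plain Python (the function name and the ε value are illustrative, not from any particular library):

```python
import math
import random

def dp_count(true_count: float, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query changes by at most 1 when one record is added or
    removed (sensitivity = 1), so adding noise drawn from
    Laplace(scale = 1 / epsilon) satisfies epsilon-differential privacy.
    """
    scale = 1.0 / epsilon
    # Sample Laplace noise via the inverse-CDF method.
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Smaller epsilon => more noise => stronger privacy, less accuracy.
noisy = dp_count(true_count=1042, epsilon=0.5)
```

The key design point is the privacy budget ε: it quantifies the trade-off between how much an analyst learns and how well any one person is hidden.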

Locking Down Data

Make no mistake, a borderless data world filled with clouds, APIs, and the IoT has fundamentally changed the stakes. Extracting the full value of sensitive data presents growing risks—mainly that others can gain access inadvertently or through an attack or breach. As data travels through cloud containers and out to an ecosystem, these risks are magnified. Different security standards, different equipment, and more touchpoints can create chaos.

As a result, many organizations find themselves hesitant to share and pool data in ways that unlock greater value. For example, a group of airlines might benefit from pooling maintenance and repair data without revealing the specifics of the data. Banks can learn about fraud patterns by studying a larger set of data. Business partners might want to share data—say an airline with a credit card company—in order to understand customers better.

Understandably, most firms prefer to retain full control of data on premises. This makes it easier to enforce security policies and to hold the encryption keys that lock the data. The downside is that they leave much of the value of that data on the table, completely untouched. But with modern data-security tools, it's increasingly possible to have the best of both worlds: extracting greater value from data while keeping it locked down.

A Better Way

A starting point for navigating data security is to understand what options exist and how they can help. Advanced tools such as data tokenization, homomorphic encryption, and privacy-preserving techniques are indispensable. Alone or together, they can lock down many types of data and allow it to be used more widely and liberally.
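Data tokenization, for example, can be pictured as keyed hashing: each PII value is replaced with a deterministic, non-reversible token, so records can still be joined and analyzed without exposing the underlying values. A minimal sketch using only Python's standard library (the key and field values here are placeholders, and a production system would keep the key in a key-management service):

```python
import hashlib
import hmac

def tokenize(pii_value: str, secret_key: bytes) -> str:
    """Replace a PII value with a deterministic, non-reversible token.

    The same input always maps to the same token, so joins, counts, and
    analytics still work, but without the key the original value cannot
    be recovered from the token.
    """
    return hmac.new(secret_key, pii_value.encode("utf-8"),
                    hashlib.sha256).hexdigest()

key = b"example-secret-key"  # placeholder; store in a KMS in practice
token = tokenize("jane.doe@example.com", key)
```

Because tokenization is deterministic per key, two datasets tokenized with the same key can still be linked on the tokenized field, which is what makes secure data pooling possible.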

It’s also wise to consider using a Trusted Execution Environment (TEE), which relies on a secure area of a processor to guarantee that the code and data residing in it are kept confidential. This environment is particularly valuable for machine learning. With a TEE, it’s possible to have a high degree of confidence that sensitive data is fully protected even in a shared environment. A TEE is also attractive because it complements encryption and other privacy protections.

A TEE allows businesses to extract value, apply insights in real time, and predict outcomes that can accelerate business growth. Organizations can share data and perform deep analytical tasks without exposing the data or the underlying algorithm. Operating on cleartext inside a TEE is also significantly faster than computing over homomorphically encrypted data, while scaling much like an ordinary cloud environment. All of this is possible, including keeping machine learning models and data secure, because everything stays within an isolated processing environment: applications and data execute in isolation from the rest of the system.

It’s also wise to keep an eye on another emerging data-security tool, federated learning (sometimes called federated AI), which sends the algorithm to the device holding the data rather than moving the data into a cloud or database for central processing. Although the technique is only a few years old, it is advancing rapidly and promises to reshape the data-science field. It permits businesses, research institutes, law enforcement, and government agencies to pool insights and train machine learning models jointly across organizations and geographies, without anyone seeing anyone else’s raw data.
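The core mechanic behind this, often called federated averaging, is simple to sketch: each participant trains on its own local data, and only the resulting model updates are sent to a coordinator for averaging. A toy illustration (the one-parameter model and the client data are invented purely for demonstration):

```python
def local_update(w: float, data: list, lr: float = 0.1) -> float:
    """One gradient-descent step on a client's private data.

    Toy model: fit a single parameter w to scalar targets by minimizing
    mean squared error. Only the updated weight leaves the device; the
    raw data never does.
    """
    grad = sum(2.0 * (w - y) for y in data) / len(data)
    return w - lr * grad

def federated_average(updates: list) -> float:
    """Server-side step: average client weights without seeing raw data."""
    return sum(updates) / len(updates)

# Three clients, each with data that stays in its own silo.
client_data = [[1.0, 1.2], [0.8, 1.0], [1.1, 0.9]]
w = 0.0
for _ in range(50):
    updates = [local_update(w, data) for data in client_data]
    w = federated_average(updates)
# w converges toward the mean of all clients' data (1.0 here),
# even though no client ever shared its records.
```

Real deployments add secure aggregation and differential privacy on top of this loop, so that even the individual model updates reveal little about any one participant's data.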

To be sure, there is a growing array of data-protection tools to consider for your arsenal, particularly as you venture into machine learning and AI. As additional government regulations appear and public attitudes about security and privacy evolve, keeping trade secrets and PII secure isn’t just a good idea, it’s critical. We’re entering a new age of data protection, and those with the right tools will prevail.
