
Recognizing and Warding Off the Threat of Data Exfiltration

January 5, 2022

Data exfiltration presents a clear and present danger for businesses as they grapple with increasingly sophisticated cyberattacks. But are organizations powerless in the face of this highly charged security landscape? Certainly not. Data de-identification is a highly effective method for businesses to protect their sensitive data in the event of a breach, without compromising their ability to use and analyze data for business value.

Organizations cite concerns about their ability to prevent data exfiltration

A recent BlackFog survey of 225 cybersecurity professionals found that, despite heavy investments in security tools, respondents are increasingly concerned about their ability to prevent data exfiltration. According to the report, less than half (43 percent) are confident they can prevent cybercriminals from stealing data. More than two-thirds (68 percent) believe their existing data loss prevention (DLP) solutions are 'difficult to configure and maintain', and 51 percent report their solutions are 'just incompetent.'

In a year marked by successful high-profile breaches (SolarWinds, Colonial Pipeline) and increased security vulnerabilities due to a remote-centric, turnover-heavy workplace, these findings should be worrisome for all stakeholders involved in running an organization. With cybercriminals continuing to deploy sophisticated social-engineering-oriented phishing attacks and devising new strategies for ransomware campaigns, the threat of data exfiltration is here to stay.

Preventing data exfiltration in the event of a cyberattack

Data exfiltration is an end game with many starting points: attackers can employ a wide variety of tools and techniques to achieve the same objective, the theft or unauthorized removal of data from a device or network. The best, and really the only, way to stop data exfiltration is to deploy a multi-layered defense.

When protecting data at rest, organizations typically safeguard their perimeter with endpoint protection and/or anti-phishing software. The next layer of defense usually consists of data loss prevention tools that monitor for exfiltration activity and encryption tools that render stolen data useless.

So data at rest is protected, and that's the end of the story, right? Think again. Along with sitting in databases and applications, data is constantly in motion, and that movement introduces new opportunities for unauthorized access. One strategy for protecting data in motion is to deploy encrypted connections, such as transport layer security (TLS). However, TLS alone isn't foolproof: it's typically reserved for connections to the outside world, leaving internal network communications vulnerable to man-in-the-middle or insider attacks. When sensitive data such as personally identifiable information (PII) or payment card information (PCI) must stay protected wherever it travels, tokenization is ideal.
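To make this concrete, here is a minimal sketch of deterministic tokenization in Python. It uses a keyed HMAC rather than any vendor's actual product, and every name in it (the key, the fields, the token prefix) is an illustrative assumption:

    import hmac
    import hashlib

    # Secret key held by the tokenization service, never by data consumers.
    # Illustrative only; real deployments keep keys in an HSM or KMS.
    TOKEN_KEY = b"replace-with-a-managed-256-bit-key"

    def tokenize(value: str) -> str:
        """Derive a deterministic token from a sensitive value.

        Vaultless in spirit: the token is computed with a keyed HMAC,
        so no lookup table of originals needs to be stored. The same
        input always yields the same token, preserving equality.
        """
        digest = hmac.new(TOKEN_KEY, value.encode("utf-8"), hashlib.sha256)
        return "tok_" + digest.hexdigest()[:16]

    # A stolen record reveals nothing useful without TOKEN_KEY.
    record = {"name": tokenize("Jane Doe"), "card": tokenize("4111111111111111")}
    print(record)

Note that an HMAC-derived token is one-way; tokenization schemes that must support later re-identification use format-preserving encryption or a secured token vault instead.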

The most effective technologies, policies, and processes

No security tool is perfect. All of them have a failure rate, which is why the industry gold standard is to layer them, on the assumptions that a) a breach will occur and b) the attackers will gain a foothold. It's fair to say, though, that some tools carry more weight than others. A tool that renders data unusable to unauthorized viewers, such as encryption or tokenization, is a tremendously effective deterrent because it strips stolen data of its value. Vaultless tokenization is especially powerful because it combines high security with flexibility and central control.

It's also important to remember that not all sensitive and proprietary data lives in a database table. You wouldn't have to look very hard to find files or unstructured text containing sensitive customer and employee data, for example. Organizations should implement strong file encryption technology that is flexible enough to handle the wide variety of data types and data stores in play.
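As a rough illustration of file encryption (assuming the third-party Python cryptography library, and made-up file names), a few lines are enough to render a stolen file unreadable:

    from cryptography.fernet import Fernet  # pip install cryptography

    # Generate the key once and store it in a KMS, never beside the data.
    key = Fernet.generate_key()
    f = Fernet(key)

    # Encrypt an unstructured file containing sensitive employee data.
    with open("employee_notes.txt", "rb") as src:
        ciphertext = f.encrypt(src.read())
    with open("employee_notes.txt.enc", "wb") as dst:
        dst.write(ciphertext)

    # Only a holder of the key can recover the plaintext.
    plaintext = f.decrypt(ciphertext)

The hard part in practice is not the encryption call but key management and coverage: the tooling has to find and protect every file type and data store in play.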

On the policy side, centralizing data access policy makes organizations more efficient at granting access to data while establishing a clear path for auditing and compliance work. From a process standpoint, de-identifying data with tokens is also a flexible way to meet privacy and PCI industry requirements. De-identifying data about people (e.g., PII) removes the ability to identify the subject of the data: humans cannot tell what the data is or who it is about, while machines can still perform analytical queries and deliver data to applications in a protected state.
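Because deterministic tokens preserve equality, aggregate queries still run on de-identified data. A small sketch, reusing the hypothetical tokenize() helper from above on made-up order rows:

    from collections import Counter

    # De-identified rows: customer emails were replaced by tokens at ingest.
    orders = [
        {"customer": tokenize("jane@example.com"), "amount": 40},
        {"customer": tokenize("raj@example.com"), "amount": 15},
        {"customer": tokenize("jane@example.com"), "amount": 25},
    ]

    # Group-by works on tokens exactly as it would on raw identifiers,
    # without revealing who the customers are.
    totals = Counter()
    for row in orders:
        totals[row["customer"]] += row["amount"]
    print(totals.most_common())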

Moving past the status quo

Traditional DLP tools typically lock down data and render it unusable for analytical purposes. This presents obvious drawbacks, which is why organizations should consider a new model for data protection that is strategically designed to empower data sharing. Data de-identification, mentioned earlier, is one such strategy. With data de-identification, the default state ensures the reader cannot understand what the data says. Data is stored as tokens and can be moved, analyzed, and re-identified only when necessary. This 'need to share' model keeps data secure while still offering access to the right people: organizations get security and retain business value. In today's environment, that's a rare win-win.
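To show what 'need to share' can mean in practice, here is a deliberately simplified re-identification gate. The in-memory vault and the role list stand in for a real token store and a centralized access-policy service; both are assumptions for illustration:

    # Hypothetical token vault mapping tokens back to originals. A real
    # system would use a secured store or reversible format-preserving
    # encryption, not an in-memory dict.
    VAULT = {"tok_3f9a": "jane@example.com"}

    AUTHORIZED_ROLES = {"fraud_analyst", "compliance_auditor"}

    def reidentify(token: str, role: str) -> str:
        """Return the original value only to roles with a need to share."""
        if role not in AUTHORIZED_ROLES:
            raise PermissionError(f"role {role!r} may not re-identify data")
        return VAULT[token]

    print(reidentify("tok_3f9a", "fraud_analyst"))  # jane@example.com
    reidentify("tok_3f9a", "marketing")             # raises PermissionError

Every call through a gate like this also leaves a natural audit point, which is the compliance benefit of centralizing access policy mentioned above.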
