Category Archives: Data Loss Prevention

A Gartner Analyst and "It's Time to Redefine Data Loss Prevention"

Today, it seems to be in vogue to criticize DLP solutions as out of date, insufficient for modern business needs, and generally out of touch with industry realities.

One of the more notable sources to voice this opinion has been none other than industry leader Gartner.

In an analysis piece entitled “It's Time to Redefine Data Loss Prevention” [1], Gartner goes after the most dominant trends in DLP. The article asserts that security and risk management leaders need to shift away from current trends in data loss protection and “implement a holistic data security governance strategy.” This, it argues, is the only way for IT departments to ensure “data protection throughout the information life cycle.”

The Gartner write-up lays out a nuanced but ultimately damning case against contemporary DLP. Note that GTB Technologies customers were not part of the analysis, as the report appears to cover only “Gartner Market Leaders”.

The summary of their argument looks something like this:

Despite a market awash in DLP solution options, organizations still struggle with communication between data owners and those responsible for administering DLP systems. A symptom of this disconnect is that managers opt for programs that will automate the work of DLP. This has resulted, according to Gartner, in “technology-driven — rather than business-driven — implementations.”

Another problem, says Gartner, is that many DLP users struggle to get past the initial phases of discovering and monitoring data flows after the platform is first deployed. The focus on these meticulous tasks means that organizations never realize the potential benefits of “deeper data analytics” or of “applying appropriate data protections.”

Lastly, the article points out that users, whether individuals or enterprises, view DLP as a “high-maintenance tool” requiring constant attention and a substantial regular investment of man-hours. This ultimately leads to “incomplete deployments” relative to a system's actual DLP needs. As a result of all of these phenomena, says Gartner, companies end up stuck with systems that require constant fine-tuning, and they struggle to calculate the ROI on their substantial DLP investments.

While all of the above points are fair criticisms of contemporary DLP, the approach the analysis offers to solve these problems is off the mark. Gartner suggests a total shift in data loss management: moving away from reliance on technology and instead “sharing responsibility” for DLP among the different constituents of an organization. To achieve better DLP, the industry does not need to run away from technology; rather, it needs to adopt programs that address the very real problems Gartner has laid out.

GTB’s Smart DLP that Works™ is a platform designed to do just that.

Using patented artificial intelligence models, GTB's data loss prevention programs manage sensitive data with an AI-based approach. This allows the platform to learn and map the network, freeing IT from the tedious maintenance attached to other solutions. Thanks to the precision of its detection technology, its ease of use, and its quick time to value, Smart DLP streamlines processes instead of bogging down administrators with errors and false positives.

With Smart DLP, managers can have their cake and eat it too. GTB assures users that security does not come at the expense of efficiency.

[1] “It's Time to Redefine Data Loss Prevention,” Gartner, published 19 September 2017, ID G00333194.

Zero Trust Data Protection

Out with the Old: Conventional security models, those based on firewalls, IDS, and the like, operate on the outdated assumption that “everything on the inside of an organization's network can be trusted”. The contemporary threat landscape facing IT has shown that this is simply not true. The increased attack…
Read more

Badmouthing Data Loss Prevention (DLP) is Fashionable

Is DLP Really Dead?

I recently came across several digital security vendors whose sites describe their products as a “DLP alternative.”

Perusing their pages, I came across comments such as “DLP is hard to deploy”, “DLP is hard to maintain” and the classic: “DLP is heavy on the Endpoint”. It's clear that these security vendors are trying to influence analysts by inserting negative sentiments into the industry's discourse on DLP. Of course, terms such as “hard” or “heavy” are subjective at best and can't be taken as a concrete, professional assessment.

But my real issue with remarks like these is their shallow understanding of Data Loss Prevention.

Vendors and analysts tend to do a mediocre job of explaining what DLP actually is. Most people treat DLP as a single, specific product. In reality, DLP is a set of tools designed to protect data in various states. Here's my definition: a DLP system performs real-time Data Classification on Data in Motion and Data at Rest, and enforces predefined security policies on those streams and stores.

This definition also requires us to flesh out our terms. “Data in Motion” means data on its way from a network to another destination, such as the internet, an external storage device (USB), or even a printer or fax machine. “Data at Rest” is data that resides in databases or in any unstructured file anywhere on the network. “Data Classification” is the implementation of a DLP policy using specific markers, such as credit card or Social Security numbers. These markers allow a given transmission of data to be placed in a specific category such as PCI or HIPAA.
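To make classification concrete, here is a minimal sketch in Python. The patterns and category names are illustrative assumptions on my part, not any vendor's actual detection logic; production engines layer checksums, context analysis, and fingerprinting on top of raw patterns like these.

import re

# Illustrative marker patterns for two common PII categories.
PATTERNS = {
    "PCI":   re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # candidate payment card numbers
    "HIPAA": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # SSN-style identifiers
}

def classify(text: str) -> set:
    """Return the policy categories whose markers appear in the text."""
    return {name for name, pattern in PATTERNS.items() if pattern.search(text)}

# A message leaving the network is tagged before any policy is enforced.
print(classify("Card 4111 1111 1111 1111, SSN 078-05-1120"))  # {'PCI', 'HIPAA'}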

From the definition above one can see that DLP is not a single tool, but rather a set of content-aware tools that include a wide range of applications including Network DLP, Endpoint Device Controls, Application Controls, and Data Discovery.

So Which Part is Dead?  

Now that GDPR is in full effect, it is hard to understand how Data Discovery is dead, or even “seriously ill” as some observers have put it. One of the basic GDPR requirements is to inventory and classify data containing Personally Identifiable Information, or PII. Such data can reside in a wide range of storage areas, including file shares, cloud storage, and in-house databases. Once the data are discovered, they need to be protected from dissemination to unauthorized entities. Far from being a thing of the past, DLP tools will play a vital role in achieving compliance with the most important set of data regulations ever to hit the world of information technology.

Today's DLP tools are designed mainly to protect PII.

This is a requirement of most data protection regulations in existence, such as PCI, HIPAA, CA1386, the California Consumer Privacy Act of 2018, GLBA, GDPR, NY DFS Cybersecurity, PDPA, and SOX. But protection isn't as simple as guarding personal details stored on the network. Effective DLP requires a system capable of comprehensive Data Discovery. Achieving Data Discovery means understanding where all enterprise data is located and mitigating the risk of loss through remedial actions such as the following (a brief sketch of one such action appears after the list):

  •         Changing Folder Security Permissions
  •         Moving/Copying the data to another secure folder
  •         Encryption
  •         Redacting images with sensitive data
  •         Enforcing Digital Rights Management
  •         Classification
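As one concrete illustration, here is a minimal Python sketch of two of the remedial actions above: moving a flagged file to a secure folder and tightening its permissions. The quarantine path is hypothetical, and a real DLP platform would drive such actions from policy rather than from a standalone script.

import os
import shutil
import stat
from pathlib import Path

QUARANTINE = Path("/secure/quarantine")  # hypothetical secured share

def remediate(flagged_file: Path) -> Path:
    """Move a file flagged by discovery into quarantine and restrict it
    to owner-only read/write (two of the remedial actions listed above)."""
    QUARANTINE.mkdir(parents=True, exist_ok=True)
    target = QUARANTINE / flagged_file.name
    shutil.move(str(flagged_file), target)         # relocate to a secure folder
    os.chmod(target, stat.S_IRUSR | stat.S_IWUSR)  # owner-only permissions (0o600)
    return target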

In addition to these passive defense steps, DLP must also have ways of identifying threats and protecting against attacks on a network. Proprietary algorithms, such as GTB's artificial intelligence programs, can identify even partial data matches, keeping managers alert to any attempt at data exfiltration by a malicious insider or by malware. Though inaccurate at detecting data exfiltration on its own, User and Entity Behavior Analytics (UEBA), combined with intelligent DLP, may be able to identify the presence of malicious programs on a system. In this way, systems such as GTB's address the insider threat as well, ensuring that neither a company's personnel nor its digital applications become the means of data loss.

But here’s the million-dollar question: if DLP is so essential, why is it getting such a bad rap?

Let’s try to understand where this negative perception came from.

Here are some of the end-user complaints described in a Deloitte survey entitled “DLP Pitfalls”:

  1.    “High volumes of false positives lead to DLP operations team frustration & inability to focus on true risks”
  2.    “Legitimate business processes are blocked”
  3.    “Frustration with the speed at which the DLP solution becomes functional”
  4.    “Unmanageable incident queues”

These complaints stem from the fact that most DLP vendors have mediocre detection capabilities. This is because almost all systems use predefined data patterns, called templates, to locate and classify data on a system. While templates are easy to define and use, they produce waves of false positives that render the system useless from a practical perspective. Customers are left feeling they've bought an expensive toy rather than a system meant to secure their data. No wonder customers are frustrated by DLP's capabilities and its value.
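A small sketch shows why templates misfire. The pattern below is a typical Social Security number template; it cannot distinguish a real SSN from an order reference with the same shape, which is exactly how incident queues fill with noise. (Both sample values are illustrative.)

import re

# A typical "template": nine digits shaped like a Social Security number.
ssn_template = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

# Both strings match the template, but only the first is actual PII; order
# numbers, ticket IDs, and phone fragments trigger the same pattern.
print(bool(ssn_template.search("Patient SSN: 078-05-1120")))  # True (real hit)
print(bool(ssn_template.search("Order ref: 555-12-9876")))    # True (false positive)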

So, is it possible to solve the dreaded false positives dilemma produced by DLP systems?  

Fortunately, the answer is yes.

Using content fingerprinting of PII and defining multi-field detection policies, such as requiring a last-name marker and an account-number marker within a certain proximity of each other, hones in on specific data and whittles away irrelevant files. With this multi-tiered scheme, the system detects the company's actual data rather than a data pattern that may or may not be relevant, an approach that has been shown to reduce false positives to almost zero.
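Here is a rough Python sketch of that idea, assuming a toy fingerprint store and a simple character-distance proximity check. Real fingerprinting engines, GTB's included, use far more sophisticated partial-match algorithms, so treat this only as an illustration of the multi-field principle.

import hashlib
import re

def fp(value: str) -> str:
    """One-way fingerprint of a database field (a plain hash here; real
    engines support partial and rolling matches)."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# Fingerprints built from the organization's own records, e.g. a customer table.
last_names = {fp("smith"), fp("nguyen")}
accounts = {fp("8812-4401")}

PROXIMITY = 40  # max characters allowed between the two matched fields

def violates_policy(text: str) -> bool:
    """Flag only when a known last name and a known account number both
    appear near each other; lookalike values never match the store."""
    tokens = [(m.start(), m.group()) for m in re.finditer(r"[\w-]+", text)]
    name_hits = [pos for pos, tok in tokens if fp(tok) in last_names]
    acct_hits = [pos for pos, tok in tokens if fp(tok) in accounts]
    return any(abs(n - a) <= PROXIMITY for n in name_hits for a in acct_hits)

print(violates_policy("Re: Nguyen, account 8812-4401 is overdue"))  # True
print(violates_policy("Call 8812-4401 about the Johnson file"))     # False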

While some DLP vendors support content “fingerprinting”, they do not promote the technique, and for a good reason: the number of fingerprints produced can become so large that the system can crash, or at the very least slow down, the network.

But this is not true for all DLP systems. GTB's proprietary fingerprinting technology allows customers to fingerprint up to 10 billion fields without network degradation.

And as for the concern DLP systems are “unmanageable” and hard to use?

I disagree with the premise.

Even the more sophisticated functions of a DLP system, such as running a select statement against one PII table while defining a multi-column policy over its fields, are actually quite simple.
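For illustration, a minimal sketch of that workflow: pull the PII columns with an ordinary select statement, then treat each row as one multi-column policy entry. The database file, table, and column names are hypothetical, and the final step stands in for whatever policy API a given DLP console exposes.

import sqlite3

# Hypothetical staging database; assumes a customers table already exists.
conn = sqlite3.connect("customers.db")
rows = conn.execute("SELECT last_name, account_number FROM customers").fetchall()

# Each row becomes one multi-column policy entry: both fields must be
# detected together (e.g. within a set proximity) to raise an incident.
policy_entries = [{"last_name": ln, "account_number": acct} for ln, acct in rows]
print(f"Registered {len(policy_entries)} multi-column policy entries")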

In summary, DLP is not a singular tool, and not all DLP systems are the same.

Contrary to the naysayers, the growth projections for the industry clearly show that DLP is not “seriously ill” and is definitely not dead.

DLP is Still Going Strong

For nearly two decades, deploying Data Loss Prevention (DLP) has been a staple for enterprises seeking to secure their data assets. Despite being such an important part of an information security strategy, some observers have recently begun to cast doubt on the future of DLP within the industry. Insider Threat Management and UEBA…
Read more