Big data, heavy data & DCAP

With information volumes surging and compliance burdens growing, enterprises need a new approach to data security


By 2025, humans will have generated 180 zettabytes of data [i] – or 180 trillion gigabytes. That’s a lot, and well up from the mere 10 zettabytes we’d created back in 2015.

Managing and protecting all that information has become a significant challenge. Surging data volumes include important business information that’s often stored in multiple repositories: different geographies, business units, departments, formats; and housed both on-premises and in the cloud.

Draconian regulations

Regulatory regimes like the EU’s General Data Protection Regulation (GDPR) and the coming California Consumer Privacy Act (CCPA) pile on the pressure by compelling organizations to rigorously catalog all the personally identifiable information they have on file and monitor it continuously for potential breaches.

Because of this, companies must submit to more data audits and assessments than ever before. Many struggle to prove that they have full visibility of critical data, with appropriate protections in place.

There is also a trend toward using ever-greater amounts of information from outside sources. Categorizing all third-party data and understanding its sensitivity before it enters company systems is critical.

With data so dispersed, repositories overflowing, and audits and regulatory scrutiny on the rise, traditional methods of ensuring data security are quickly coming unstuck.

Last year there were more than 6,500 publicly disclosed incidents of data loss affecting more than 5 billion records. Despite large and ongoing investment in cybersecurity, data assets remain vulnerable. Many companies today simply couldn’t tell you where all their sensitive data is – particularly when it sits in unstructured formats or is housed in relational databases, data warehouse hardware, or big data sources located on-premises and in the cloud.

That lack of data-centric protection increases the risk of data breaches and compliance failures – both of which can exact a painful cost.

A better way to protect data

Organizations in every sector depend on the security, accuracy, and availability of their data to generate revenue. Data helps businesses better serve customers, boost productivity, understand the drivers behind business outcomes and plan for the future.

So data is now business critical. But if you don’t know where your prize assets are, you can’t protect them. With data breaches on the rise, every organization needs to look again at its approach to risk mitigation and ensure it has the tools and processes in place to deliver these core data protection capabilities:

  • Locating and classifying sensitive data regardless of where it sits in the organization.
  • Applying protection mechanisms to valuable and personally identifiable data in order to mitigate breaches.
  • Sustaining compliance with current data security and privacy regulations, including the ability to monitor data and user behavior and to report adverse activity quickly.
  • Visualization capabilities that allow users across the business to conduct analytics.
  • Reporting capabilities that ensure robust audit readiness.
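As a minimal illustration of the first capability – locating and classifying sensitive data wherever it sits – the sketch below scans a directory tree for a few common PII patterns. This is a hypothetical, regex-only example for clarity; commercial DCAP products use far richer detection (dictionaries, data fingerprints, machine learning) and are not limited to the patterns shown here.

```python
import re
from pathlib import Path

# Hypothetical PII patterns for illustration only; real classifiers
# combine many detection techniques, not just regular expressions.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify_text(text):
    """Return the set of PII categories detected in a piece of text."""
    return {name for name, pattern in PII_PATTERNS.items() if pattern.search(text)}

def scan_directory(root):
    """Walk a directory tree and map each file to the PII categories it contains."""
    findings = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            categories = classify_text(path.read_text(errors="ignore"))
        except OSError:
            continue  # unreadable file: skip rather than fail the whole scan
        if categories:
            findings[str(path)] = sorted(categories)
    return findings
```

A report built from `scan_directory`’s output is the kind of inventory auditors increasingly expect: which repositories hold which categories of sensitive data.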

The rise of DCAP

Gartner Research predicts [ii] that by 2020, 40 percent of enterprises will have replaced the disparate and siloed data security tools currently in use with data-centric audit and protection (DCAP) products. DCAP tools deliver a centralized view of all at-risk data, allowing organizations to track their sensitive data and protect it in line with regulatory and risk management requirements.

DCAP is all about encouraging IT teams to focus on data, not the underlying information technology. While other data security approaches keep IT departments forever chasing potential threats, DCAP focuses on viewing, monitoring, and managing how users interact with high-risk data sets.

IT retains responsibility for the installation, configuration, and management of business apps and network infrastructure, but leaves responsibility for data with the people who really understand its value: the owners – those who created it, their departments, and their managers.

The five pillars

Its five pillars enable companies to get to grips with their most sensitive information:

  • Classifying data across the IT estate, and implementing policies that can categorize files as they are created.
  • Controlling data and/or privileges – from access to editing to blocking – with unique profiles for specialist users like system administrators and developers.
  • Reporting of user activity to detect suspicious behavior.
  • Tracking and controlling security events as they occur, helping organizations to better understand where vulnerabilities may be hiding.
  • Centralizing data management through a dashboard that enables administrators to apply security policies quickly across the network.
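The “controlling” pillar can be pictured as a policy table that maps each role to the actions it may perform at each data classification level. The roles, levels, and actions below are purely hypothetical, chosen to show how a specialist profile (here, a system administrator who manages systems but should not edit sensitive data) differs from a data owner’s:

```python
# Hypothetical role/classification policy illustrating per-profile control.
# A sysadmin profile deliberately gets no edit rights on sensitive data,
# even though that role administers the underlying systems.
POLICY = {
    "employee":   {"public": {"read", "edit"}, "confidential": {"read"},         "restricted": set()},
    "sysadmin":   {"public": {"read"},         "confidential": {"read"},         "restricted": set()},
    "data_owner": {"public": {"read", "edit"}, "confidential": {"read", "edit"}, "restricted": {"read", "edit"}},
}

def is_allowed(role, classification, action):
    """Check whether a role may perform an action on data of a given classification."""
    return action in POLICY.get(role, {}).get(classification, set())
```

In a real DCAP deployment these decisions come from the centralized dashboard described above, so administrators can change a policy once and have it enforced everywhere.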

While achieving all five can seem like a monumental challenge, the technology needed to execute the DCAP model is now well within reach – even for large enterprises. Companies are now taking advantage of the next generation of analytics tools to better understand where their data crown jewels reside, track who’s accessing them, and apply appropriate levels of protection.

With GTB's built-in machine learning and analytics, organizations can achieve a greater level of data insight and confidence that they are in compliance. They can also control how files are shared on popular collaboration and enterprise content management platforms like SharePoint.

GTB’s Data Protection that Works™ platform gives companies a single foundation for implementing DCAP across the entire enterprise. By streamlining data protection processes with artificially intelligent algorithms, GTB delivers high data security assurance without disrupting smooth business operations.


[i] 2016 IoT Midyear Review – The Report Card for Everyone, IDC, August 4, 2016.

[ii] Market Guide for Data-Centric Audit and Protection, Gartner Research, March 21, 2017.
