Managing the attack surface of your data, not your infrastructure

As data moves offsite into public cloud services or service provider sites, data protection strategy will have to change

Managing a company's exposure to risk is an everyday task for IT security professionals. This is often based on knowing the "attack surface" that the company has.

Your attack surface is the aggregate of all your vulnerabilities: the known, unknown and potential weaknesses that might exist in your company's software, hardware, firmware and network assets.

Based on this, it's possible to assess and manage the risk of an attack being successful. Reducing the number of different assets, applications and devices cuts the total attack surface. After all, fewer potential vulnerabilities in things such as software should mean reduced risk of exposure over time.

The concept of attack surface is normally divided into sub-categories of software attack surface, network attack surface and human attack surface. From an infrastructure perspective, this makes a lot of sense. We harden our software during development and patch it after the fact. IT security teams deploy firewalls and a host of other network-based solutions to harden the perimeter. Employees across the business will receive awareness training and regular reminders about threats.

However, this approach misses how IT strategies have changed over the past few years. The growth of mobile and edge computing means that as much as 40 per cent of enterprise data never hits the corporate data centre. For all the benefits it delivers, the attack surface approach does not currently look at the data that people create.

Taking a different approach

Looking at the data attack surface is a different way to analyse a company's overall attack surface and its constituent parts. Data can be created and hosted on any asset, from cloud applications to standard endpoints, yet data held within a cloud application is rarely considered in its own right.

When examining the data attack surface, the concept often gets lost in the data classification space. Data classification is tantamount to a "grand challenge" for security organisations and usually ends up being far too complex for anyone to understand or implement. Well-meaning security and compliance teams expect end users to manually apply a labelling scheme of five to seven categories to Word docs, PowerPoint presentations, and the like.

Huge amounts of effort are spent training end users to apply category labels to the data they create whenever they make new documents or files. In this scenario, if end users don't understand how and when to apply these labels, they will simply ignore them. Let's try a simpler approach.

How to attack the problem

The key to understanding your organisation's data attack surface is to ask some basic questions:

1) Do I know where all my data lives?
2) Can I get visibility into all those places?
3) What kind of security analysis can I perform on that data?

Taking a data-centric approach to security can help with compliance challenges too. For example, the EU's forthcoming General Data Protection Regulation (GDPR) will enforce greater accountability around personally identifiable information (PII) held on customers. Each and every set of customer PII data should be protected adequately.
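As a rough sketch of what "protected adequately" checking can look like in practice, the snippet below flags records that appear to contain customer PII so they can be routed to controlled storage. The patterns shown (emails and UK-style phone numbers) are illustrative assumptions only; real PII detection needs far broader coverage and validation.

```python
import re

# Illustrative patterns only: real PII detection covers many more
# categories (names, addresses, national IDs) with proper validation.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b(?:\+44|0)\d{10}\b"),  # crude UK-style numbers
}

def find_pii(text):
    """Return the set of PII categories detected in a piece of text."""
    return {name for name, pattern in PII_PATTERNS.items()
            if pattern.search(text)}

# Hypothetical records: anything flagged would need GDPR-grade handling.
records = [
    "Invoice 1042 sent to jane.doe@example.com",
    "Quarterly revenue up 12 per cent",
    "Call back on 07700900123",
]
flagged = [r for r in records if find_pii(r)]
```

Even a crude first pass like this surfaces which stores hold regulated data, which is the prerequisite for deciding whether each one is adequately protected.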

To meet this requirement, it is important to understand where data exists currently and how this might change over time. This will require an audit of all places where data is currently saved. The growth of shadow IT and departmental level cloud applications means that there may be some surprises out there.
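The audit step can start as simply as counting what lives where, so that surprises such as a departmental share full of spreadsheets become visible. This is a minimal sketch, assuming you already have a list of the storage roots you know about; the function name and approach are illustrative, not a reference to any particular tool.

```python
from collections import Counter
from pathlib import Path

def audit_locations(roots):
    """Inventory each known storage location: count files by type."""
    inventory = {}
    for root in roots:
        # Tally file extensions under this root; "(none)" for bare files.
        counts = Counter(
            p.suffix.lower() or "(none)"
            for p in Path(root).rglob("*") if p.is_file()
        )
        inventory[root] = counts
    return inventory

# Example: audit_locations(["/mnt/finance-share", "/mnt/hr-share"])
# returns a per-location breakdown of what file types live there.
```

Extending the same loop to cloud storage APIs and endpoint agents is what turns a one-off audit into the full overview described below.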

This audit activity should provide you with a full overview of how IT services are used in practice and how employees really store their data. Once you have that overview, you can shift focus from reactive protection to proactive analysis and monitoring of data-centric threats.

The purpose of this proactive analysis is twofold. Firstly, it's important to know when and how to adjust your security controls over time. Secondly, it's critical to understand data in context. This background should help you see how much of this data is subject to compliance regulations and check that data is being protected adequately.

Data gravity and changing protection models

As data moves offsite into public cloud services, cloud applications or service provider sites, so your data protection strategy will have to change too.

However, moving to the cloud can help ensure that the whole data attack surface is tracked over time. Subsequently, telemetry on application data and usage can be used to direct security efforts too. For example, if most data originates from mobile devices, it perhaps makes sense to shift security resources to protect that vector.
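The telemetry step described above amounts to ranking data sources by volume. A minimal sketch, assuming events are already tagged with the device class that created each piece of data (the field names here are assumptions):

```python
from collections import Counter

def volume_by_source(events):
    """Rank data-creation vectors by total bytes, largest first."""
    totals = Counter()
    for event in events:
        totals[event["source"]] += event["bytes"]
    return totals.most_common()

# Hypothetical telemetry: three events from two vectors.
events = [
    {"source": "mobile", "bytes": 120},
    {"source": "laptop", "bytes": 80},
    {"source": "mobile", "bytes": 200},
]
ranking = volume_by_source(events)  # mobile ranks first here
```

If "mobile" tops the ranking, that is the signal to weight security resources towards that vector.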

This ability to get more insight into where data is being created over time can help security planning. By combining the theory of data attack surface into wider security discussions, you should be able to keep up with how IT assets are being used in practice.

Drew Nielsen is director of enterprise security at Druva
