There’s been a surge in organisations attempting to develop a big data strategy in recent years, with businesses keen to mine any and all information they collect.
The rush for profits above all else, however, often means that enterprises don’t put the necessary thought into the privacy implications of their data strategy. Much like the NSA’s surveillance programmes, a lot of information is being collected and stored by firms just because they’re able to do it without facing any real consequences or opposition.
This creates a number of concerns around what data is collected, how it should be stored and interrogated, and what should be done about organisations which go too far – if we can agree on what “too far” means.
All of this was discussed by a panel of experts at Computing’s Big Data Summit recently. According to Martyn Croft, CIO at The Salvation Army, the fact that organisations are examining ways to exploit big data without thinking about the privacy implications of collecting and storing it could prove a huge problem. Using The Salvation Army as an example, Croft explained that information today is cheap, which leads organisations to attach little value to it.
The danger is that if the information isn’t valued by the firm, then it will have little incentive to protect it properly.
But while an organisation might not value an individual dataset, the individual to whom that information pertains usually will. “For my organisation, much of that data is about an individual and I think if it’s about an individual, you have a duty to take care of that data and take care over how you try to transform it and use it,” he said.
That view, however, seems to go against the grain, with many organisations content to scoop up all they can, without any policies in place as to how it’ll be used – a strategy that Robert Bond, head of data protection and information security at law firm Speechly Bircham, argued is risky.
“The idea that we could do so much with this data so we’ll just suck up and keep as much as we can for as long as we can, because you never know when you might need to use it, is highly dangerous,” he said, explaining how it adds to the risk of a costly data breach.
“Because the more data you’ve got the more likely you are to lose it or to be hacked. The law in various countries including here states you can’t keep data longer than is necessary,” Bond continued.
“If you don’t have a plan as to why you are keeping that data for that period, when something goes wrong you are going to get hammered that much harder by the regulators.”
Bond suggested encrypting data as one remedy, and said a data governance policy would also help – “God helps those who help themselves” – as regulators are likely to be less harsh on firms that have some sort of policy in place.
But while there are ways businesses can attempt to protect the privacy of those whose data they collect – through anonymisation, for example – Barry Coatesworth, information security officer for retailer New Look, believes that the nature of big data means that the information collected will never truly be anonymous.
“I don’t think there’s going to be any silver bullet that’s going to keep aggregated data anonymous in three different sources, because bringing it all together could identify an individual,” he said. “It’s always going to happen, the more sources of data you have, the less chance you have of being anonymous.”
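The linkage attack Coatesworth describes can be shown with a toy sketch (the datasets, names and field choices below are entirely hypothetical): two datasets that are each “anonymous” on their own can be joined on shared quasi-identifiers, such as postcode and birth year, to tie a named person to sensitive information.

```python
# Toy illustration (hypothetical data): joining two "anonymised" datasets
# on shared quasi-identifiers can re-identify an individual.

# Dataset A: "anonymised" medical records - no names
medical = [
    {"postcode": "SW1A 1AA", "birth_year": 1975, "diagnosis": "asthma"},
    {"postcode": "E1 6AN",   "birth_year": 1982, "diagnosis": "diabetes"},
]

# Dataset B: public electoral-roll-style records - names, no health data
public = [
    {"name": "A. Smith", "postcode": "SW1A 1AA", "birth_year": 1975},
    {"name": "B. Jones", "postcode": "N1 9GU",   "birth_year": 1990},
]

def link(records_a, records_b, keys=("postcode", "birth_year")):
    """Join two datasets on quasi-identifier fields; each match
    attaches an identity to a supposedly anonymous record."""
    matches = []
    for a in records_a:
        for b in records_b:
            if all(a[k] == b[k] for k in keys):
                matches.append({**a, **b})  # merged record names the person
    return matches

reidentified = link(medical, public)
# The postcode/birth-year combination is unique enough to tie
# "A. Smith" to the asthma diagnosis.
print(reidentified)
```

The more datasets an attacker can bring together, the more quasi-identifier combinations become unique – which is exactly why Coatesworth argues aggregated data can never be truly anonymous.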
If individuals are uncomfortable with the idea that organisations might collect and store data about them, it’s theoretically possible for them to opt out by not engaging with those firms. However, with so many companies harnessing data, Coatesworth argued the only alternative to handing it over would be akin to living as a hermit.
“I think we all have a choice, but it’s fairly limited,” he said. “We could lock ourselves away and be a hermit, but we don’t want to do that so there is a choice, but it’s about whether you want to engage with society as a whole. You have a choice but it’s so restricted it would be to have no life.”
But according to Bond, people can influence how their data is handled if they come together and push for change, giving a privacy row involving Facebook as an example.
“With Facebook a couple of years back, they unilaterally changed their terms and conditions of business and effectively crowdsourcing said ‘we don’t agree’ and they had to change it. Together we can do lots but as individuals it’s hard,” he said.
However, Darren Myles, information governance manager at health insurer Bupa, argued that terms and conditions can rapidly become obsolete, and that the nature of big data means there’s really nothing specific consumers can be asked to consent to.
“The aim with big data is to collect as much data as you can and stick it with other data you’ve got. So the current approach of getting consent and understanding, those are the things that potentially are outmoded by big data and what it’s trying to achieve,” he said, arguing that it’s up to organisations to be ethical in how they use that data.
“The idea you can get consent for use when big data is about answering questions you never thought of asking, just isn’t practical, you can’t agree. The emphasis is therefore on the organisations to be responsible in their use,” Myles said.
“When it comes to a conscious decision of the organisation, their efforts and what they choose to do with that data compared to what they believe their customers expect, that’s the extra dimension we have to consider.”
But in the business world, a firm will make decisions based on the potential impact on the bottom line, so if all of its rivals are making gains by exploiting big data, a business can’t really afford to take a moral stance on data collection if doing so will leave it behind the competition.
“If there’s a situation where there’s six players in the marketplace, you’re one of them, and the others are all saying let’s do it one way, and you’re saying let’s do it this way, that’s a challenge,” said Myles. “Are you going to do what everybody else is doing, making a commercial decision, or do you make an ethical decision?” he asked.
It’s very apparent that some of the largest corporations – with Google a prime example – don’t see this as a decision at all. They’re more than happy to scoop up every bit of available data about individuals in order to improve their advertising platforms, and ultimately make more money. That currently presents a problem because, as Bond pointed out, the punishments dished out by regulators don’t make a dent in their profits, so they’re not going to think twice.
“The size of the [potential] fine is sadly driven not by privacy regulators but by banking and financial services regulators. And the data protection regulators don’t have the right to fine in the UK more than £500,000. Google makes that in a nanosecond, so that’s why they’ll take a risk-based approach as to what’s happening with your data,” he said.
Bond, however, pointed out that this might soon change, thanks to upcoming data protection regulations from the European Commission, which could see organisations fined vast amounts for misusing data.
“It is going to change in the new regime. Egregious misuse could lead to a fine of up to five per cent of global annual turnover. That’s eye-watering stuff,” he said.
For a company like Google – to which half a million pounds is a paltry sum – such a fine could run to hundreds of millions, a figure that might force it to rethink its data policies.
Until we reach that point, however, we should expect organisations to continue to put the bottom line ahead of ethics as they scoop up every piece of data they can. And until the new regulations come into force, it’s up to individuals to be as careful as they can when sharing information with firms.