How can businesses avoid the pitfalls of 'big data discrimination'?

Lawyers react to FTC report suggesting organisations are potentially excluding whole sections of society by relying on big data analytics

Businesses are at risk of causing "harmful exclusion or discrimination" by the improper use of big data.

That's the key warning in a report by the United States Federal Trade Commission - the US government agency tasked with protecting consumers - which suggests that in "the era of big data" there are both benefits and risks to mass analysis of customer activity.

Titled 'Big Data: A Tool for Inclusion or Exclusion', the FTC report also warns that corporations need to consider whether their "reliance on big data raise[s] ethical or fairness concerns". The FTC notes that fairness questions arise over how big data and analytics might be used.

"For example, one company determined that employees who live closer to their jobs stay at these jobs longer than those who live farther away," said the report.

"However, another company decided to exclude this factor from its hiring algorithm because of concerns about racial discrimination, particularly since different neighbourhoods can have different racial compositions," it added.
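The report does not describe how that exclusion was implemented, but the underlying idea - removing a variable that can act as a proxy for a protected characteristic before it ever reaches a model - can be sketched in a few lines. The field names below are hypothetical, purely for illustration:

```python
# Hypothetical sketch: stripping a potential proxy feature (commute distance)
# from a hiring model's inputs. Because different neighbourhoods can have
# different racial compositions, where an applicant lives can stand in for
# race even when race itself is never collected.
applicant = {
    "years_experience": 7,
    "skills_score": 80,
    "commute_km": 25,  # candidate proxy for neighbourhood, and hence race
}

# Features deliberately excluded from the model on fairness grounds.
PROXY_FEATURES = {"commute_km"}

# Build the model's input by filtering the proxies out.
model_inputs = {k: v for k, v in applicant.items() if k not in PROXY_FEATURES}

print(sorted(model_inputs))  # → ['skills_score', 'years_experience']
```

Dropping the field before training, rather than after, means the model never has the chance to learn the correlation in the first place.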

Commenting on the report, Nicola Fulford, head of data protection and privacy at law firm Kemp Little, argued that while big data has been "revolutionizing the commercial world" by allowing businesses to target different segments of society, there are concerns that "big data discrimination" could "potentially disenfranchise whole segments of the population".

"Individuals may suffer negative consequences as a result of the automated analysis of their personal data, which could lead to discrimination in credit, employment, health, pricing and many more areas of life," Fulford told Computing.

"For example, certain promotions may only be offered to individuals under the age of 50 or individuals with university degrees," she added.

Fulford explained that current UK data protection laws contain restrictions on the extent to which businesses can make decisions about individuals using automated means, so "an individual has the right to ask a business to stop taking such automated decisions using their personal data".

Nonetheless, Fulford explained that businesses still need to be mindful about how they process data.

"It is worth bearing in mind that data protection laws will only apply where data is about a specific individual; it may not apply to segmentations of people. So groups of individuals may still suffer discrimination as a result of big data analytics, unless prevented by discrimination laws which make it unlawful to discriminate against people with 'protected characteristics'," she said.

Upcoming European Union legislation on data protection will somewhat alleviate this issue, she said.

"The new General Data Protection Regulation (GDPR) will contain new restrictions on profiling, a broader definition of what is classified as personal data (including IP addresses and device IDs) as well as a higher standard of consent that must be given by the individual before any profiling can take place," said Fulford.

Phil Lee, partner in the privacy, security and information group at European law firm Fieldfisher, also suggested the new legislation will make organisations more accountable with regard to what they can and can't do with data.

"These have specifically been designed with big data in mind - with rules around minimizing data collection, not 'repurposing' data for different uses, profiling and the right for individuals to be 'forgotten' from datasets," he told Computing.

"Combined with new accountability obligations, these will place additional compliance duties on big data operators that seek to ensure they use data in ways that are fair, ethical and lawful," Lee continued. "And if they don't, they face a big stick - potential fines of up to four per cent of global turnover."

Back in the US, the FTC also has some advice for corporations that are using or considering using big data analytics.

"Remember that just because big data found a correlation, it does not necessarily mean that the correlation is meaningful. As such, you should balance the risks of using those results, especially where your policies could negatively affect certain populations," reads the report.
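The FTC's caution about chance correlations is easy to demonstrate: screen enough candidate variables against any outcome and some will correlate impressively by luck alone. The toy simulation below (entirely synthetic data, not from the report) makes the point with random numbers:

```python
# Toy illustration of the FTC's warning: compare 1,000 purely random
# "features" against a purely random outcome for 50 people. The strongest
# correlation found looks substantial, yet by construction it is meaningless -
# which is why mined correlations need validation before they drive decisions.
import random
import statistics


def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


random.seed(0)
n_people, n_features = 50, 1000

outcome = [random.random() for _ in range(n_people)]

# Strongest correlation found among many random, unrelated features.
best = max(
    abs(correlation([random.random() for _ in range(n_people)], outcome))
    for _ in range(n_features)
)

print(f"strongest chance correlation: {best:.2f}")
```

With this many features the "best" correlation typically exceeds 0.3 despite every variable being noise - a pattern that would evaporate on fresh data, exactly the kind of result the FTC suggests weighing before letting it shape policy.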

Ultimately, despite all the automated benefits big data is supposed to bring, the FTC suggests that manual intervention by people is still necessary.

"It may be worthwhile to have human oversight of data and algorithms when big data tools are used to make important decisions, such as those implicating health, credit, and employment."
