Computing research: how and why big data has hit the mainstream

By John Leonard
10 May 2012

Some IT trends can be seen coming from a long way off. Heralded by a cloud of dust on the horizon, a low rumble on the specialist blogs, and a flame war among the cognoscenti, The Next Big Thing rides into town.

Entrepreneurs plan for its arrival, columns grow in the tech press, discussion groups proliferate on the web and new acronyms and metaphors are created. Eventually, The Big Thing (as it is now) becomes a topic of casual conversation, even among those with no technical nous.

Big data is the latest such buzzphrase. But is it really new, or simply an old technology rebadged – like those aspects of cloud computing that we called ASPs (application service providers) and utility computing in the 1990s?

The truth is that big data is an old idea whose time has come, thanks to the twin forces of technological innovation and economic necessity – just as cloud computing has become viable on the back of broadband and mobile communications, combined with those constant forces propelling the IT industry forward: faster processors and cheaper, bigger storage.

As with "cloud computing", the blandness of the phrase "big data" adds to many users' confusion. It has come to mean, variously: the explosion in the volumes of data generated, stored and processed; real-time analytics; virtualised and parallel processing; data visualisation; and many other things.

You cannot blame vendors for hitching their own wagons to the train. The big players – Oracle, IBM, SAP, Microsoft – have spent billions buying up analytics and data management firms, and "big data" is a term they want to own. However, pushing the debate down such narrow tracks does not bring clarity; indeed, it can simply breed cynicism among IT professionals.

Asked by Computing earlier this year about their opinion of the term "big data", 28 per cent of senior IT professionals at large UK organisations replied "vendor hype". Thirty-seven per cent saw big data as a big problem (presumably because they associate the term with increased data volumes), while 27 per cent said that it is shorthand for a big opportunity that few organisations grasp.


Take two organisations, identical in every respect apart from the quality of their data analytics. The firm that makes better use of more of the data it holds – turning raw data into usable information, and then into knowledge – will be the one that gets ahead. Analysts at Forrester Research estimate that enterprises use only five per cent of their available data, leaving the field open to those that can corral the remaining 95 per cent into a usable form.

Hidden value

The vast majority of Computing's survey respondents (72 per cent) recognise that their data holds hidden value. For any company – not just the oil giants and hedge funds most associated with big data – near-real-time analysis of diverse pools of data has the potential to illuminate business trends, unlock new sources of economic value, improve business processes, and more. Realising that value, however, presents a massive challenge.

 
