IT leaders grappling with “big data” find that the underlying challenge has many dimensions that extend beyond the actual volume of data. It is important for IT leaders to understand the impacts of big data on their existing approaches to information management. At the same time, they need to deploy new technologies and practices to manage data extremes — because traditional methods will fail.
There are four major impacts of big data on the IT organisation that IT leaders must understand and act on if they are to respond in a timely and effective manner.
• IT and business professionals must re-evaluate their practices for selecting which data to integrate. One of the greatest impacts of big data — and the analytics technology brought to bear on it — is the creation of a world in which potentially any information can be used to create value for a business. No longer is analytics confined to rigid sets of structured information held in databases. The volume of analysable data is growing exponentially and becoming more portable, while also being available in a greater variety of formats than ever — through mobile devices, social media, advanced business information systems and many other sources. The businesses that can best harness all forms of data — structured and unstructured — will reap the greatest rewards.
Gartner estimates that, by 2015, companies that address the full range of information management issues in their IT strategies will begin to outperform unprepared competitors by 20 per cent on every available financial metric. IT and business professionals must therefore keep challenging and refining their existing practices for selecting which data to integrate at an organisational level.
• IT support teams will have to support big-data initiatives begun by end-users, which could cause funding issues. One of the challenges of big data will be integrating separate big-data projects carried out in different parts of an organisation. Business executives, having noted the success of individual, siloed analytics projects, will want to extend that success throughout the company, and will ask the IT department to develop a strategy for supporting expanded uses. This is likely to cause friction: IT managers usually lack access to business-unit budgets, or the funds for such projects are treated as one-time investments. Moreover, because the end-users who deployed big-data systems on a limited scale won't have time to support an enterprise-wide rollout, IT staff will be left supporting systems they had no role in creating.
Gartner recommends that organisations start allocating staff and budget for the integration of end-user deployments, and that they plan to decommission deployments that are not used by at least three business units.
• Data warehouses will undergo major revisions in order to address big data, or be decommissioned. In the past, there were several options for data warehousing, such as centralised enterprise data warehouses, virtual warehouses and federated marts. The common theme was that they were repository-orientated. In the age of big data, however, a repository-only style of warehouse will be overwhelmed by simultaneous increases in the volume, variety and velocity of data.
Gartner recommends that organisations start evolving towards a logical data warehouse (LDW) — a system that de-emphasises dedicated repositories and takes an architectural approach to sharing and accessing data. By conducting trials of LDW concepts and working towards a single, logically consistent information resource, organisations can start to create a platform that can cope, and deliver value, in the age of big data.
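The architectural shift described above — away from a single physical repository, towards one logical access layer over many sources — can be illustrated with a toy sketch. This is not a Gartner reference design; all class and field names here are hypothetical, and a real LDW would sit over databases, files and services rather than in-memory lists:

```python
class CsvSource:
    """Stands in for a flat-file store of structured rows."""
    def __init__(self, rows):
        self.rows = rows

    def query(self, predicate):
        return [r for r in self.rows if predicate(r)]


class ApiSource:
    """Stands in for a remote service, e.g. a social-media feed."""
    def __init__(self, records):
        self.records = records

    def query(self, predicate):
        return [r for r in self.records if predicate(r)]


class LogicalWarehouse:
    """Routes one logical query across all registered sources,
    so no single dedicated repository has to hold everything."""
    def __init__(self):
        self.sources = []

    def register(self, source):
        self.sources.append(source)

    def query(self, predicate):
        results = []
        for source in self.sources:
            results.extend(source.query(predicate))
        return results


ldw = LogicalWarehouse()
ldw.register(CsvSource([{"region": "UK", "sales": 120}]))
ldw.register(ApiSource([{"region": "UK", "mentions": 45},
                        {"region": "DE", "mentions": 30}]))

# One logically consistent query; the caller never knows (or cares)
# which physical source each record came from.
uk_data = ldw.query(lambda r: r.get("region") == "UK")
```

The design point is that the warehouse owns routing and consistency, not storage: new sources are added by registration, not by copying their data into a central repository.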
• Using context-aware algorithms to filter big data for customers may actually drive customers away. In the brave new world of transparency and customer fairness legislation, online services could use advanced algorithms to make real-time decisions about how customer data is used, and to define the choices presented to customers. The aim would be to personalise organisations’ interaction with customers and improve satisfaction and experience levels. However, organisations must be aware that customers’ perceptions of the use of big data can be adverse — the perception that a company has used personal data in a way that its customers don’t want can undermine trust and weaken a brand’s reputation, even if the company has not breached compliance regulations.
Gartner recommends that organisations run at least one pilot initiative to assess how well their personalisation and context-aware data-handling capabilities match customers' expectations and legal requirements.
Mark Beyer is a research vice-president at Gartner and David Cearley is a vice-president and Gartner Fellow
• Do you agree? Or does the concept of big data have no bearing on your day-to-day operations? Let us know by leaving a comment.
• Computing’s Big Data Summit 2012 is on 28 June in London