Reading all the hype surrounding big data you could be forgiven for thinking that UK firms will suddenly be acquiring mind-reading powers on a par with X-Men’s Professor Xavier.
We are told that companies that have struggled for years to get accurate and timely reports from their enterprise applications will, in the future, be using in-memory analytics to transform their insights and their ability to innovate.
Since new analytics tools can access data in memory without swapping it in and out of disk, reporting cycles that once took days are being completed in minutes. Retail planners who once had to work from standard daily sales reports now realise they can launch programmes that analyse information sources for previously “hidden” insights. We are seeing enterprise departments – under daily pressure to deliver faster innovation for the business – asking their CIO and management consultants to deliver deeper, almost unheard-of, customer insights.
In time, businesses will be able to determine whether key promotions, product lines, sales territories and customers are profitable enough – or even sustainable. Insurers that once relied on postcode-level data will soon examine not just the journeys but the manner in which policyholders drive their cars, to determine premiums on an individual basis.
New and unexpected knowledge from analytics engines isn’t just a cutting-edge development, it could be a lifeline for businesses seeking to innovate in a challenging economy where capital budgets will remain tight for years.
So far, so good. But to my mind, the defining issue isn’t the technology’s impressive capabilities; it’s more about assembling a team of experts that can help an organisation through the process of acquiring these analytical platforms – without wasting money on badly focused IT development programmes, or worse, increasing risk to the business.
In the past few months, we have seen lines of business teams working with their CIO, systems integrators and third-party specialists in data analytics tools, to deliver insight programmes that ensure predictable development costs and minimise disruption to daily operations.
Through their now-considerable track record in helping customers, these systems integrators and specialists have become experts both in the key big data vendors’ toolsets and in the project management skills needed for success. With a proven record of installing real-time, high-volume analytics appliances, such providers are enabling UK companies to improve their decision-making and responsiveness – without being lured down blind alleys of development.
Next-stage software tools such as in-memory appliances will reduce query times as well as the overall demands on the organisation’s computing resources and reporting cycles. The difference is that these analytics installation specialists can work with systems integrators to address and resolve the infrastructure and software questions that inevitably surround a firm’s IT environment; issues that have often stymied previous big data projects.
Together, integrators and niche suppliers have the technology and project-management skills to solve the analytics challenge in manageable phases: scoping, risk assessment, data re-engineering, and implementation. These providers will help the IT team’s project management office to build crucial momentum within the business and achieve desired long-term results.
Delivering insights from data was once the computing equivalent of a shot in the dark, or worse, a complex IT project mired in costs. With the emergence of suppliers that can assess a company’s individual needs, analytics is gradually becoming a more contained process, delivered in phased programmes.
Forget all the hype of X-Men-style powers; the success of a big data analytics initiative will lie in the experience and proven skills of trusted consultants, technology integrators and their partners, which together ensure risk-managed, on-budget and effective programme delivery.
Glyn Heath is managing director of Centiq