Mainstream virtualisation software is now well over a decade old, and has done much to help organisations consolidate their resources and run their operations with greater flexibility and efficiency.
Legacy and mission-critical software have historically proved to be a sticking point in the virtualisation process, however. In this, the first of a two-part series, we investigate whether this is still the case with reference to a Computing survey of 150 IT professionals drawn from medium to large public and private sector organisations.
Organisations of all types today need to be light on their feet, responding to changing market conditions fast. They must possess detailed knowledge not only of their markets but of their own businesses as well. They have to be able to spot opportunities and move quickly to exploit them. And they need to be able to identify costs, bottlenecks and problems in their organisations, or in their supply chains, before these can interfere with operations.
Sometimes, however, their information architecture trips them up. Data is fragmented across myriad platforms, and organisations often face a proliferation of hardware and applications.
Bringing together all this data so that it can be better managed, analysed and utilised can be an extremely onerous task, and technical and institutional barriers can stand in the way.
Better on metal?
Every organisation deploys applications that are critical to its mission. So vital are these software systems that corporate stakeholders will do anything to safeguard their constant availability and performance.
Gradually over the last decade, organisations have migrated their applications into virtualised environments. But the all-important enterprise applications have tended to hold out. The capital and efficiency savings promised by virtualisation are of little consequence when set beside the perceived increase in business risk.
There are two upshots of this reluctance to integrate and consolidate mission-critical applications.
The first is that running and maintaining them is more costly and unwieldy than it is for less critical software. That much is understood and tacitly accepted by corporate stakeholders.
The second is that organisations lose a major potential competitive advantage. By continuing with a situation where they may be deploying enterprise applications on thousands of individual platforms and file systems, organisations deny themselves the bigger picture they need to stay responsive, agile and competitive.
As a proxy for infrastructure and application sprawl, we asked about databases, which after all underpin most enterprise applications. The Computing survey shows that a high number of enterprise database instances are deployed in UK organisations, with about one for every 20 employees (see figure). What is more, it emerged that each database instance was feeding only about five major applications on average.
These findings confirm a picture in which many departments, branch offices or subsidiaries run their own standalone enterprise applications (often residing on dedicated hardware) that sit atop their own proprietary silos of data.
Business impact of IT proliferation
The single biggest factor behind infrastructure and application sprawl is merger and acquisition activity. According to the UK Office for National Statistics (ONS), about 300 to 600 medium to large British firms engage in mergers or acquisitions each year.
But according to a study by KPMG, about 83 per cent of corporate mergers fail to deliver the value and business growth predicted at the time of the merger. Instead, the merged organisation gradually becomes less than the sum of its parts.
In the second part of this series, we will discuss the "risk perception gap", its implications and how it can be closed.