Organisations need to focus now on the changes required to take advantage of the performance and operational improvements that the data centres of the future will offer.
Speaking as part of a panel at IPEXPO this week, Anthony Saxby, information platform product manager at Microsoft, said that companies are on the threshold of having cheap and simple access to vast amounts of computing capability in the cloud, following on from the cheap storage that's already on offer today.
"Think about what could happen once people have access to vast amounts of storage, compute and networking bandwidth, then how do you exploit that? What do you do with data or applications that you couldn't think about previously because the investment in infrastructure was too much?
"The focus now needs to be on realising that potential. There are undoubtedly untapped opportunities in terms of other ways of using compute power, storage and networking we haven't thought of yet that will change the ways companies do business," said Saxby.
David Flynn, CEO of solid state memory vendor Fusion-io, pointed out that while change is coming, many organisations will still use traditional storage, compute and networking methods.
"We still have dinosaurs of datacentres around. People still programme in COBOL, or use mainframes. But we're finally at the cusp of moving from the era of proprietary closed systems to one where storage itself can be distributed among high-performance commodity systems. This will change the pace of innovation."
Javier Benitez, senior network architect at telecommunications company Colt, explained what his firm wanted from next-generation datacentres.
"We want to enable customers to change service characteristics automatically, including bandwidth and Quality of Service. We want to be able to automate that change process, either through a customer-managed portal, or give APIs to a customer-managed application so they can do it themselves.
"We do that today, but it's very inefficient, and it's hard work," added Benitez.
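The kind of automation Benitez describes would typically be exposed as a web API that a customer-managed application calls to change service characteristics. The sketch below is purely illustrative: the payload fields, values and service identifier are assumptions, not Colt's actual API.

```python
import json

def build_service_change(service_id, bandwidth_mbps, qos_class):
    """Build a JSON payload requesting new bandwidth and QoS settings
    for an existing network service (hypothetical schema)."""
    return json.dumps({
        "serviceId": service_id,        # identifier of the service to modify
        "bandwidthMbps": bandwidth_mbps,  # requested committed bandwidth
        "qosClass": qos_class,          # requested Quality of Service tier
    })

# A customer application would POST a payload like this to the
# provider's API instead of raising a manual change request.
payload = build_service_change("svc-42", 500, "premium")
print(payload)
```

In practice such a payload would be sent over an authenticated HTTPS endpoint, replacing the slow, manual change process Benitez calls "hard work".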
Saxby stated that in order to be able to cope with the coming changes, and benefit from the advances in technology, it is crucial that organisations employ teams with the right level of training - and mindset - to embrace change.
"CIOs have to spend a lot of time worrying about maintaining service levels, and any change poses risk," said Saxby. "Change brings an element of fear, but IT departments need to invest in people to ensure that, from a training and skills point of view, you can take calculated risks, so you can understand and not fear the unknown."