The key to weathering the storage pricing fallout caused by last year's floods in Thailand doesn't lie in expanding memory, says Hitachi Data Systems' CTO Hubert Yoshida, but in using what's already available more effectively.
Speaking to Computing at HDS's Europe, Middle East and Africa conference in Barcelona, Yoshida said, "For 50 years we've had a [storage] price erosion of about 30 per cent annually. In other words, we would double the capacity every 18 months, and that would reduce the cost of the capacity – so, for the same price, you'd get more capacity.
"So what happened was, people would start to depreciate their hardware over four to five years because it was cheaper to buy new than maintain old."
The 2011 floods in Thailand knocked out that supply chain, disrupting the market's standard price-erosion curve.
"By end of summer, prices should be back to normal," said Yoshida. "There'll be some jockeying around with price cuts, but the prices long-term will not return to 30 per cent erosion. Because in addition, technologies for hard disk have slowed down."
The industry now stands at the limit of perpendicular magnetic recording with 3TB discs, explained Yoshida, and even taking into account Seagate's recently announced HAMR technology, which has crossed the 1 terabit per square inch storage barrier, progress in physical storage density is decelerating rapidly.
"Seagate said they project that, by the end of the decade, we'll have 6TB discs," said Yoshida. "That's eight years, which means we're not going to be doubling capacity every 18 months. So that means the price erosion is going to level out – to more like 15 or 20 per cent."
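The arithmetic behind those figures can be sketched: if the price per gigabyte halves with each capacity doubling, the doubling period fixes the annual erosion rate. A minimal illustration follows; the halving-per-doubling assumption is mine, not Yoshida's, and real-world erosion also reflects competition and maintenance economics, so capacity alone understates the quoted 15 to 20 per cent.

```python
# Annual price erosion implied by a capacity-doubling period,
# assuming price per GB halves with each doubling (an illustrative
# assumption, not a figure from the article).

def annual_erosion(doubling_years: float) -> float:
    """Fraction by which price per GB falls each year."""
    return 1 - 0.5 ** (1 / doubling_years)

# Doubling every 18 months -> roughly the historical erosion Yoshida cites
print(f"{annual_erosion(1.5):.0%}")  # prints "37%"

# One doubling in 8 years (3TB -> 6TB) -> far slower capacity-driven erosion
print(f"{annual_erosion(8):.0%}")  # prints "8%"
```

On these assumptions, the post-flood doubling pace by itself implies single-digit erosion, which is why Yoshida's 15 to 20 per cent forecast must also lean on factors beyond raw areal density.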
Yoshida continued: "That has a major impact; a major cost in IT and storage. So we have to become much more efficient in the way we use storage. That's why virtualisation is key.
"With the intelligence in the control unit, and the media separate, it's more like five to seven years' depreciation," said Yoshida. "That's a long time in terms of other technologies like thin provisioning.
"So with virtualisation, people have the opportunity to adjust their costs to compensate for price erosion. I don't think a lot of people realise the erosion won't be as fast as before, or the implications in terms of cost."