James Taylor, senior solutions engineer at backup appliance vendor Sepaton, sees very little movement of mission-critical systems to the cloud.
"The companies that we typically deal with... they've moved small parts of their environment to the cloud but the core of the business remains very much in-house," he told Computing.
As large, complex organisations, Sepaton's typical customers have a lot to manage, and handing some responsibility to a third party would be an attractive option. But in the end it comes down to security and risk: do internal policies allow data to be managed outside the organisation?
"In my experience there's more focus on key control indicators [than on using a third party to reduce the workload] - making sure they have the controls in place to manage their data effectively, making sure they can recover data in the time required," said Taylor.
That is not to say that these firms are not using the cloud for backup at all.
"Things are becoming fragmented. There's a level of tiering regarding the type of data and services they need to get back and within what particular timeframe - what's written in the SLA - but in general customers are more concerned about keeping their own control over their data than farming the problem out. Things such as email will definitely be farmed out, but core business components - the lifeblood of the business? Organisations will ensure these are protected to the required level."
Sepaton has a foot in both camps, finding a niche for its backup appliance solutions with MSPs in addition to its traditional market of large organisations such as banks, energy companies and public-sector bodies.
"We have the ability to do storage pooling. You have one big managed system but you can segregate it into many pools of configuration within that single system. Each pool has its own individual configuration - in multi-tenancy hosted environments this storage pooling aspect is very useful."
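The storage pooling Taylor describes can be sketched as a simple data model: one large managed system carved into isolated, individually configured pools, one per tenant. The class and field names below are illustrative assumptions, not Sepaton's actual API.

```python
from dataclasses import dataclass, field

# Minimal sketch of multi-tenant storage pooling: a single physical system
# segregated into per-tenant pools, each with its own configuration.
# Names, fields and quota logic are hypothetical, for illustration only.

@dataclass
class StoragePool:
    tenant: str
    quota_tb: int            # capacity reserved for this tenant, in TB
    dedup_enabled: bool = True

@dataclass
class BackupSystem:
    total_capacity_tb: int
    pools: dict = field(default_factory=dict)

    def add_pool(self, pool: StoragePool) -> None:
        # Refuse a pool whose quota would oversubscribe the physical system.
        allocated = sum(p.quota_tb for p in self.pools.values())
        if allocated + pool.quota_tb > self.total_capacity_tb:
            raise ValueError("pool quota exceeds remaining system capacity")
        self.pools[pool.tenant] = pool

# One big managed system (2 PB here), segregated into pools per hosted customer.
system = BackupSystem(total_capacity_tb=2000)
system.add_pool(StoragePool(tenant="bank-a", quota_tb=500))
system.add_pool(StoragePool(tenant="energy-co", quota_tb=300, dedup_enabled=False))
```

In a hosted MSP environment, each pool's independent configuration is what keeps one customer's backup policy and capacity from interfering with another's.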
Data volumes are increasing all the time, but Taylor argues that the firm's appliances are scalable enough to eliminate that particular advantage of cloud backup.
"Typically our target market is second-generation platform users who've bought a smaller box based on upfront costs. Then over a period of a year, two years, three years... that one box turns into four, or six or eight. Rather than having a single scalable platform that you can build modular arrays onto, or extra processing nodes, when you hit a critical peak that means a complete reinvestment as well as the ongoing maintenance for the original platform. We can scale up to two petabytes of usable disk with 80 terabytes per hour backup performance."
With compression and deduplication, Taylor claims, the two petabytes of usable disk can effectively hold four to eight petabytes of backup data, and the system can be upgraded without disruption.
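The arithmetic behind that claim is straightforward: effective capacity is usable capacity multiplied by the data-reduction ratio. The 2 PB figure comes from Taylor; the 2:1 and 4:1 ratios are the ones implied by his four-to-eight-petabyte range, not published Sepaton specifications.

```python
# Back-of-the-envelope effective capacity after dedup/compression.
# Usable capacity (2 PB) is from the article; reduction ratios are
# the illustrative 2:1 and 4:1 implied by the quoted range.

USABLE_PB = 2.0  # usable disk capacity in petabytes

def effective_capacity_pb(usable_pb: float, reduction_ratio: float) -> float:
    """Logical data that fits when dedup/compression shrinks it reduction_ratio:1."""
    return usable_pb * reduction_ratio

print(effective_capacity_pb(USABLE_PB, 2.0))  # → 4.0 PB
print(effective_capacity_pb(USABLE_PB, 4.0))  # → 8.0 PB
```

Real-world reduction ratios vary widely with data type, so such figures are best treated as an upper range rather than a guarantee.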