It has long been recognised that there is some information that is best not stored at home. For many years, documents such as wills, title deeds and similar papers have been lodged with organisations considered to be secure, such as government bodies, the church and solicitors.
From a personal point of view, the reasons are obvious: if your house burns down, you do not want the proof that you own it to go up in flames too. You want some continuity.
The same is obviously true of companies today. While all organisations are awash with information, some of it is vital to the day-to-day running of the company. In the event of a disaster – and the fears of a technical glitch or some other homegrown catastrophe have given way to the increasing threat of terrorist attacks – it is paramount for a company to get itself up and running as quickly as possible, whether for purely business reasons, or as a matter of civic importance.
That means being able to access a copy of the essential data – and the most recent copy – and integrate it into a duplicate IT infrastructure with the minimum of cost, effort and time.
All but the most short-sighted firms have a disaster recovery programme built into their day-to-day processes. At regular intervals, important information is backed up and then stored in a different location, where it should escape any breakdown or damage at the parent site, and be ready to populate databases and servers elsewhere.
Given that some analysts have indicated that more than 80 per cent of businesses go bankrupt within one year of catastrophic data loss, such precautions are an everyday business necessity.
For many years, the most popular choice was a firestore: a location remote from the main company which was often nothing more than a fireproof, bomb-proof and, in the case of many West Coast US companies, earthquake-proof vault in which the day’s magnetic tapes were shipped at close of business.
Of course, the idea of dedicated employees crawling from the wreckage of their buildings and making their way to the firestore was always idealistic, and, in today’s environment, laughable. Nowadays, companies are more likely to deploy a data vault, in which data is stored automatically and logically over dedicated lines or the internet, with the ability to be used within seconds of any disaster.
Law firm Lees Lloyd Whitley switched to electronic vaulting when its tape-based backup and recovery system started to show its age. The equipment and software had become unreliable, offered little resilience and were difficult to manage.
With its headquarters in Birkenhead on Merseyside, the organisation has 220 employees and a data volume totalling more than 500GB.
Switching to EVault InfoStage enabled IT staff to centralise the backup from remote offices to an electronic vault in Birkenhead, with EVault DeltaPro technology reducing the data volumes of the original backup by 98 per cent.
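EVault does not publish DeltaPro's internals, but the general technique behind such reductions, shipping only the blocks that have changed since the previous backup, can be sketched as follows (the block size, hash choice and file contents here are purely illustrative):

```python
import hashlib

BLOCK = 4  # tiny block size for illustration; real products use KB-sized blocks


def block_hashes(data):
    """Fingerprint each fixed-size block of the previous backup."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]


def delta(new, old_hashes):
    """Return (offset, block) pairs for blocks that differ from the last backup.

    Only these pairs need to cross the wire; unchanged blocks are
    reconstructed at the vault from the previous generation.
    """
    changed = []
    for i in range(0, len(new), BLOCK):
        block = new[i:i + BLOCK]
        j = i // BLOCK
        if j >= len(old_hashes) or hashlib.sha256(block).hexdigest() != old_hashes[j]:
            changed.append((i, block))
    return changed


# Hypothetical file: only the middle block changed since yesterday's backup,
# so only that block is transmitted.
yesterday = b"aaaabbbbcccc"
today = b"aaaaXXXXcccc"
to_send = delta(today, block_hashes(yesterday))
```

In this toy example only 4 of 12 bytes travel to the vault; on a real file set where most data is unchanged day to day, the same idea is what makes reductions of the order quoted above plausible.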
Many vendors offer an all-in-one backup/data vault/recovery system, but what should you be looking for when considering such a product?
First, you should look at the provisions for backup itself. Will the company help you determine the optimum frequency for your backups, dependent on your business needs? What about when the backups are taken? Is that your choice, or theirs? Can you keep generational backups, allowing recovery to be rolled back to the optimum point in time?
Ideally, you want a product that supports full, incremental and differential backups, covering all eventualities. Such backups should also be transparent to the network: having your network slow to a crawl every night at 6pm as the backup commences is hardly conducive to good business practice.
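The distinction between the three backup types can be illustrated with a toy snapshot model. The file names and contents below are hypothetical; real products work at the block or file-system level, but the logic is the same: an incremental captures changes since the last backup of any kind, while a differential captures everything changed since the last full backup.

```python
import hashlib


def snapshot(files):
    """Hash each file's contents so later changes can be detected."""
    return {name: hashlib.sha256(data).hexdigest() for name, data in files.items()}


def changed_since(files, baseline):
    """Files that are new, or whose contents differ from a baseline snapshot."""
    return {name: data for name, data in files.items()
            if baseline.get(name) != hashlib.sha256(data).hexdigest()}


# Hypothetical file set: name -> contents
files = {"ledger.db": b"v1", "contracts.doc": b"v1"}

full = snapshot(files)                    # full backup: everything captured

files["ledger.db"] = b"v2"                # day 1: one file changes
inc1 = changed_since(files, full)         # incremental: changes since last backup
after_inc1 = snapshot(files)

files["contracts.doc"] = b"v2"            # day 2: another file changes
inc2 = changed_since(files, after_inc1)   # incremental: only day 2's change
diff = changed_since(files, full)         # differential: all changes since the full
```

The trade-off follows directly: incrementals are the smallest to take but a restore must replay the full backup plus every incremental in order, whereas a differential restore needs only the full backup and the latest differential.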
Investment in additional technology is also important. How will the data be transmitted? Over a dedicated line, with the associated costs? Or across the internet, with the associated bandwidth considerations? If you are transmitting across a public network, what provisions have been made for security? Will your data be encrypted? This is important, not just for the backup, but for the recovery as well.
Once stored in the data vault, how secure is your information?
Vendors will almost certainly deploy external server farms which hold the backups from more than one company. You must be sure that you, and only you, have access to your own information.
The organisation of the data is also important: you want the vendor to have stored it in a fully searchable form, with your search parameters rather than those imposed by the vendor. It is also obvious that the vendor’s equipment should be fault-protected: you do not want to find your backups sitting in the middle of a pile of broken-down kit when you need them the most.
Finally, does the vendor make provision for restoration of your corporate data? Will you be able to bring your business back up within hours of a disaster, or will you have to wait for the necessary window in the vendor’s schedule? What about infrastructure? Will the vendor provide the hardware and software you will need to restore your business?
There is also the bottom line to think about: is the cost of a data vault system cheaper than the cost of downtime, or is a traditional firestore actually the best bet?
As with all network provisions, a full cost and risk assessment should be carried out before selecting a solution.
Whatever you decide, it is clear that you must make a decision about the backup and recovery of your corporate data – unless you want to be one of those companies that never recovers from disaster.