The recent fires at two separate Iron Mountain storage facilities burst the bubble for many of us who take offsite data storage for granted. Most well-documented backup and archive strategies include statements such as "send offsite" or "send to vault," but little thought is ever given to what happens once the data gets there, beyond a vague feeling of security because the data is offsite, away from the threats to which your data center could be exposed.
Without minimizing the importance of making sure your vault provider offers adequate fire protection for your data, adhering to backup and archive best practices can reduce the impact of a vault fire to little more than the cost of replacement media. Here are some common-sense best practices everyone should follow:
Make copies of your backups
In the world of paper records, there are often requirements to keep originals in a very secure place due to the unique and often irreplaceable nature of certain documents. But in IT, we deal with electronic records, where a copy is as legitimate as the original set of ones and zeros. By keeping duplicates of your backups, one onsite and one offsite, data is protected from loss due to fire or to far more common causes such as media error. Should one of the copies be lost, it is a simple matter of recreating it from the other. Most backup software makes this an easy task, and the loss is reduced to the time it takes to recreate the copy and the cost of replacement media.
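The duplicate-and-verify step above can be sketched in a few lines. This is a minimal illustration, not a replacement for your backup software's own duplication feature; the file names and directory layout are hypothetical, and a checksum comparison stands in for whatever verification your tooling provides.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def duplicate_backup(source: Path, dest_dir: Path) -> Path:
    """Copy a backup image and verify the copy against the original."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    copy = dest_dir / source.name
    shutil.copy2(source, copy)  # preserves timestamps along with contents
    if sha256(copy) != sha256(source):
        raise IOError(f"verification failed for {copy}")
    return copy

# Demo with a throwaway "backup" file; a real backup set would be a
# tape image or disk-based backup file produced by your backup software.
with tempfile.TemporaryDirectory() as tmp:
    tmp = Path(tmp)
    original = tmp / "nightly.bak"
    original.write_bytes(b"backup payload" * 1024)
    copy = duplicate_backup(original, tmp / "offsite_staging")
    print(sha256(original) == sha256(copy))  # True when the copy is intact
```

The point of the verification step is that a duplicate only protects you if it is readable; comparing digests after the copy catches the media errors the article mentions before the copy leaves the building.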
Send copies offsite daily
Do not keep all of your eggs in one basket; data centers can burn, too. When was the last time you assessed your own fire suppression system? And if you do not have remote data replication capabilities, there is little point in making copies daily if they are only sent offsite weekly -- unless your only concern is media error.
Make copies of archives, too
There is often a belief that archives do not really need to be duplicated: after all, this is no longer production data, and we may not need to access it for a long time, if ever. In fact, archived data is often more sensitive, because it is frequently the only remaining record -- the original was deleted from the production system when it was transferred to offsite-bound media. Unless data is archived purely "just in case we ever need it," it should be duplicated and kept in a separate location, especially if it is tied to any compliance requirement.
If cost is an issue
If an organization considers duplicating backups too costly, it must first consider how critical its data is. In such cases, weigh the cost of losing the data against the cost of adequately protecting it; logic then dictates that the lower cost is the way to go. But beware of hidden costs: while estimating the cost of duplicate backups is fairly straightforward, calculating losses is far less simple.
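The cost comparison above amounts to a simple expected-loss calculation. The figures below are hypothetical placeholders for illustration; real estimates would come from your own media, vaulting, and business-impact numbers.

```python
# Hypothetical figures for illustration only; substitute your own estimates.
annual_duplication_cost = 12_000.0    # media, handling, second vault slot ($/year)
incident_probability = 0.01           # estimated chance per year of losing the single copy
loss_if_unrecoverable = 2_000_000.0   # estimated business loss if the data is gone ($)

# Expected annual loss from keeping only a single backup copy.
expected_annual_loss = incident_probability * loss_if_unrecoverable

print(f"Duplication cost:     ${annual_duplication_cost:,.0f}/year")
print(f"Expected annual loss: ${expected_annual_loss:,.0f}/year")
print("Duplicate backups pay off"
      if expected_annual_loss > annual_duplication_cost
      else "Single copy may be cheaper on paper")
```

With these placeholder numbers the expected loss ($20,000/year) exceeds the duplication cost, which is the article's point: even a small annual probability of losing a sole copy can dwarf the price of a second set of media, and that is before the hidden costs the article warns about.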
In any case, single backup copies are generally a bad idea. Even if the vault is fireproof and next door to the fire department, backup media is exposed to risk the moment it leaves your hands.
About the author: Pierre Dorion is a certified business continuity professional for Mainland Information Systems Inc.
This was first published in June 2008