10 mistakes to avoid in your disaster recovery planning process
This article is part of the February 2014 Vol. 12 No. 12 issue of Storage magazine
Don't make your disaster recovery planning process even harder than it is by trying to do too much or by cutting corners. Careful planning is your best bet for a successful recovery.

At the start of the new year, many IT folks (and perhaps a few business managers) resolve to take steps to prevent avoidable interruption events and to cope with interruptions that simply can't be avoided. In short, they decide to get serious about data protection and disaster recovery planning for business IT operations.

Why the disaster recovery planning process can be so tough

Disaster recovery (DR) planning is a complex and time-consuming task when done properly, which helps to explain why, over the past few years, surveys have shown the number of companies with continuity plans declining. In one annual PricewaterhouseCoopers study, the share of surveyed companies with DR plans fell from roughly 50% in earlier years to approximately 39% last year. And among those companies, the ones that actually test their plans are usually a fraction of those that claim ...
Features in this issue
This "Sweet 16" roster of storage products represents the leading technical innovation of the past year.
Don't make your DR planning process harder than it is by trying to do too much or cutting corners. Careful planning is key to a successful recovery.
There are two sides to the big data story: analytics using vast numbers of small files, and dealing with storage for really big files.
Our latest survey charts the storage architecture alternatives readers are using in their storage shops.
Columns in this issue
Cloud closures, flash-in-the-pan solid-state vendors … storage might seem a little more dangerous these days, but it just might be innovation at work.
Filling drives with helium doesn't advance the art of hard disk design; it just makes it possible to stuff more old tech into a new package.
There aren't many reasons not to virtualize your servers, but there are plenty of compelling data protection reasons to virtualize them all.
Using Hadoop to drive big data analytics doesn't necessarily mean building clusters of distributed storage; a good old array might be a better choice.