Two sets of disaster recovery (DR) statistics have been released in the past few weeks relating to disaster recovery planning. The first set of statistics is from Symantec Corp.'s State of the Data Center 2010 report.
First, according to Symantec's pollster, "Security, backup and recovery, and continuous data protection (CDP) are the most important initiatives in 2010, ahead of virtualization." Second, 79% said data backup and recovery was "somewhat/absolutely important," while 76% rated continuous data protection as a top initiative. And third, the survey found that virtual machine (VM) data protection was a priority. Eighty-two percent of respondents said that they were considering virtual machine technologies in 2010, but believed that "granular recovery within virtual machine images" is the biggest challenge in virtual machine data protection.
Although it's not surprising that a Symantec study would discover trends favorable to disaster recovery and security spending, given both the company's focus and most firms' current dependence on automation to offset staff downsizing, it was equally telling that one-third of respondents reported that their plans were undocumented and in disarray. Those plans also fail to cover consequential IT components, such as virtual servers, remote offices and cloud computing services. "Compounding the issue," the surveyors summarized, "almost one-third of enterprises haven't re-evaluated their disaster recovery plan in the last 12 months." To my mind, that means these companies don't have a disaster recovery capability at all.
IDC's disaster recovery statistics: Survey respondents getting better data backup tools
The second set of disaster recovery statistics came from IDC on January 18. The good news was that 53% of survey respondents reported purchasing data backup software or services to protect data stored on desktops. This was "an encouraging development," according to IDC's talking heads, possibly correcting a long-standing tendency to overlook desktop data, which is probably the data most critical to day-to-day operations in most organizations. Still, the study found that 32% of respondents reported that data backups were still the responsibility of the end user, and not coordinated by any centralized disaster recovery policy. So much for actually recovering from a facility or geographical disaster.
Meanwhile, I spent my Christmas vacation catching up with the folks at CA Data Protection (XOsoft and ARCserve) and Continuity Software (makers of RecoverGuard) getting previews of feature/functionality improvements under development for policy-based disaster recovery service management. I won't steal anyone's thunder, since formal announcements will be made late this month and early next month, but I feel that there is a need for comment.
As discussed in an earlier column, CA and Continuity Software are two vendors whose products are at the forefront of disaster recovery coordination and management. Unlike paperwork management systems such as the stalwart TAMP Systems software-as-a-service (SaaS), which provides databases for storing and sharing DR plan-related data, the CA and Continuity Software folks want to get into the game of actual day-to-day management. But their approaches are very different.
In Continuity Software's case, the vendor has gone out of its way to embrace the on-hardware data mirroring and data replication services provided by data storage and server hardware vendors. They collect information on these processes from APIs provided by hardware providers and consolidate it for presentation on a dashboard screen.
Behind the scenes, RecoverGuard does some analysis and provides alerts when the boundaries for a disaster recovery strategy, expressed in terms of recovery time objectives (RTOs) and recovery point objectives (RPOs), are violated. That's good if you are using a brand-name vendor's hardware.
Spokespersons for the company said that large enterprises are their focus and that those firms tend to buy only brand names such as EMC Corp., Hitachi Data Systems (HDS), IBM Corp. and NetApp. So the product has steadily improved its support over time for effective monitoring of that gear. It does not support all data storage or server products, nor does it capture data on replication processes that may be set up via third-party software, like CA XOsoft.
Conversely, CA, with products like XOsoft Replication and XOsoft High Availability, has been steadily improving its wares to capture software-layer data protection processes, while for the most part eschewing on-hardware data replication or mirroring functions. CA takes an application-centric approach, seeking to help companies create manageable strategies for recovering applications, their data and their associated infrastructure in one fell swoop. Perhaps this strategy reflects the primary user community that CA sees for these products: small- to medium-sized business (SMB) units within large companies.
But here's the catch. If you deploy CA wares, and you want to continue using the value-add data protection software functionality that you licensed with an overpriced brand-name data storage kit, you end up with "black holes" in the visibility you have across your disaster recovery and data protection strategy. For example, CA doesn't capture the details of SRDF-based data replication on EMC rigs.
Conversely, if you deploy RecoverGuard, you can see supported on-hardware processes, but you won't get any monitoring capabilities to cover third-party wrap-around data protection and failover software processes like CA, or similar products from DoubleTake Software Inc., NeverFail Group, Symantec, etc. And that can leave gaping holes in your active disaster recovery monitoring capability.
CA's application-level approach to DR is something I like, of course, and I am seeing a lot of companies dropping their expensive licenses for internal on-hardware data replication in favor of letting XOsoft manage data protection and failover processes externally. Continuity Software will eventually wake up to this fact as they encounter more and more Global 2000 firms using CA XOsoft or other external "DR wrappers" instead of multiple on-hardware replication schemes. For now, they seem to be comfortable supporting big brand hardware players.
With that said, CA might find more traction with its XOsoft product suite if the company began providing optional API-based integration with gear from the hardware leaders, enabling it to capture proprietary data off those rigs into the XOsoft console and the disaster recovery scenarios that customers create. I may be talking out of turn, of course, since CA, more than just about any company out there, knows the vicissitudes of API-level integration. It has years of experience trying to skin the proverbial cat of enterprise management, which depends on establishing and maintaining friendly relations with a lot of proprietary hardware vendors whose hearts aren't really into any sort of common management scheme. Frankly, it strikes me as nothing short of miraculous that Continuity Software has gotten the level of cooperation it has from the vendors whose hardware APIs it supports.
For a lot of companies, corralling multiple and distributed disaster recovery processes into a common management scheme would be a godsend. This would allow a few people to keep tabs on a lot of complexity. Unfortunately, even the best-of-breed DR management software peddlers are still trying to work out the right mix of functionality. Either you use hardware-based data protection and data replication schemes, and integrate these together for management purposes via proprietary APIs that the respective vendors can alter at will (leaving gaping holes in visibility). Or, you usurp the on-hardware data protection and replication functionality that you paid too much for in the first place and instead use a DR wrapper that does all replication as an external process.
In most cases, companies will use a little of both on-hardware and external wrapper functionality for disaster recovery. Maybe CA and Continuity Software can come together to integrate with each other's products, thereby giving a more comprehensive view of what's going on.
About this author: Jon Toigo is a veteran planner who has assisted more than 100 companies in the development and testing of their continuity plans. He is the author of 15 books, including five on the subject of business continuity planning, and is presently writing the 4th Edition of his "Disaster Recovery Planning" title for distribution via the Web. Readers can read chapters as they are completed, and contribute their own thoughts and stories, at the Disaster Recovery Planning 4th edition website.