Toigo: Archiving data can accommodate ongoing data growth
Date: Jun 27, 2013
As the need to store greater amounts of data puts pressure on existing storage systems, IT administrators should begin looking at archiving data as a means of accommodating that growth while still providing adequate data protection for their organizations, according to Jon Toigo, founder of Toigo Partners International.
"So disaster recovery [DR] has a vested interest in making sure this data growth is somehow checked so it doesn't overwhelm DR processes. There is no conceivable way that the current infrastructure that you're using for data protection and restore is going to keep up with 650% growth over a three-year period," Toigo said. "Ain't gonna happen. You're going to be asking management for more money than they're willing to spend on their production environment for a recovery…. So prepare for an adventure into the dark caverns of [the] archive."
Toigo said that DR planners have to include the data archiving process as part of an overall strategy of data preservation and protection.
"Archive is part of data preservation. It adds business value to your DR plan from a governance risk compliance perspective … and you already have the data you need to do an effective job of [archiving data] because you've mapped where the output is from all your applications," he said.
An archive is more than an extra copy of backup data. It's a resource that is "accessible and useful," said Toigo.
One approach is to monitor the data usage of individual employees.
"Run a report. You got storage resource management software … [and] the report looks at who's using the most data. That's discovered by comparing all file listings by the metadata parameter that's stored with the file that identifies the user of the file," he said.
A better approach to archiving data, Toigo said, would be to implement a file segregation scheme that can sort data from different business operations and move it to a tape NAS.
He said another approach is information lifecycle management, which requires administrators to understand what data is being created and how it's being used in order to determine the appropriate storage tier and whether the data should be archived or even deleted.
"The best approach of course is ILM (information lifecycle management). Let's understand what the data is. Let's let users feel some kind of intrinsic responsibility for the maintenance and management of their own data," said Toigo. He later noted that "Ideally, we would apply a set of policies to data, and after 30 days, this kind of data gets moved off into an archive, and this kind of data goes into a review repository where it is deleted."