In the aftermath of Hurricane Katrina, a Texas university implemented a disaster recovery program that enabled it to weather Hurricane Dolly last month with little disruption, keeping its email running and its lines of communication open.
IT specialists Alejandro Herrera and Brian Matthews of the University of Texas at Brownsville (UTB) said they received word from the administration to implement a disaster recovery plan in the wake of Katrina in late 2005. The key focus was to keep Microsoft Exchange and SharePoint running so the university could communicate with students, faculty and staff through email and the school's Web site.
"The goal was for IT to come up with a plan for email in case of a catastrophic event -- so if we were to be down days or weeks, we still would have ways to communicate," Herrera said.
"We looked into virtualization, tested VMware and that worked fine," he said. "Then we needed something to replicate email."
Herrera said they looked at EMC, Double-Take and other companies' products before settling on high-availability software from XOsoft, which is now part of the CA Recovery Management suite following CA's 2006 acquisition of XOsoft.
"We did not want to alter anything in our Exchange environment, which is mainly a Microsoft cluster for staff and faculty," Matthews said. "XOsoft was the only product that it seemed like we didn't have to alter anything. Other products would require us to break our cluster."
UTB's disaster recovery setup consists of a Windows cluster on an EMC/Dell Clariion SAN in its main data center on campus, plus a disaster recovery site in Austin with virtual servers running under VMware on a Dell PowerEdge 6950 attached to a Dell PowerVault MD1000 SAS array. XOsoft high availability is used to fail over and fail back Exchange and the SQL Server instance hosting the Web site.
Herrera said the disaster recovery setup is useful for more than just disaster recovery. IT also uses its remote Austin site when doing maintenance or testing at the Brownsville campus.
"If you're replicating your student Exchange server and you need to do maintenance, you can fail it over and those students are running out of Austin," he said.
But the real payoff came when Hurricane Dolly arrived in late July. Unlike areas hit hardest by Hurricane Katrina, there was little flooding in Brownsville and only a few leaks in campus buildings. However, 100-mile-per-hour winds caused power failures that closed the campus for three days.
"The biggest problem was electricity," Matthews said. "There was a lot of wind damage, trees were down and power lines went down. Most of our city and the cities around us lost electricity. I was out 12 hours [at home], and Alex was out for three days. Some people were out seven days. But those with electricity had access to email."
The university's disaster recovery plan called for a shutdown if a storm was predicted to hit campus within 48 hours. The school closed down on Wednesday, July 23, when Hurricane Dolly was about 80 miles away.
"We went IT black," Matthews said. "That means the campus closes officially, buildings are shut down and sandbagged, and electricity is shut off. We close our servers and shut down the network, unplug the servers from the wall, switch to the Austin site, test it to make sure it works, and then shut everything locally. At that point, the email is running out of Austin."
All went according to plan. The SharePoint-driven Web site also failed over to Austin and was updated while school was closed. Between email and the Web site, the scattered population of students, faculty and staff was kept up to date on the university's status.
"Lot of emails went out, letting students know when to return to campus," Matthews said. "We sent out emails saying we would open Friday, and then another email went out saying wait until Monday."
Herrera said it took about six to 10 hours to bring everything back on Saturday. "First we turned on the SAN, then Active Directory servers and then we had to wait for everything to resync back," he said. "It took overnight to fully synchronize."
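The restore order Herrera describes — storage first, then Active Directory, then everything that depends on both — is a dependency-ordering problem. A hypothetical sketch using a topological sort (the service names mirror the article's sequence; the dependency map is an illustration, not UTB's configuration):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical bring-up dependencies: each service lists what must already
# be running before it can start.
DEPENDS_ON = {
    "san": [],
    "active_directory": ["san"],
    "exchange_cluster": ["san", "active_directory"],
    "sql_server": ["san", "active_directory"],
    "resync_from_austin": ["exchange_cluster", "sql_server"],
}

def bringup_order():
    # static_order() yields services so that each one appears only after
    # all of its dependencies.
    return list(TopologicalSorter(DEPENDS_ON).static_order())

print(bringup_order())
```

Encoding the order this way makes the "SAN before AD, AD before resync" constraint explicit instead of living only in an administrator's head.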
The biggest problem for the university's IT department was getting the servers shut down in a timely fashion. Matthews said it took him and Herrera longer than expected to close the 44 physical and 60 virtual servers in the main data center.
"After you get to the 30th server, you're getting crazy," he said. "Security and the people sandbagging were saying, 'Are you finished shutting down servers yet?' We were almost the last ones to leave campus. Next time, we'll plan for more time to shut down the servers."