The disruptiveness of this process is usually defined by the amount of data change you have going on at the primary location, your available bandwidth, and how your data is being copied, mirrored or replicated to that secondary location.
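The trade-off above between change rate and bandwidth can be sketched with some back-of-envelope arithmetic. The sketch below is illustrative only; the function names and all the numbers in it are assumptions, not figures from any vendor or from this article.

```python
# Hypothetical sizing sketch: can a WAN link keep up with the rate of
# data change at the primary site, and how long does a replication
# backlog take to drain? All figures are illustrative assumptions.

def min_bandwidth_mbps(change_rate_gb_per_hour: float) -> float:
    """Minimum sustained link speed (Mbit/s) needed to replicate
    the changed data in real time."""
    mbits_per_hour = change_rate_gb_per_hour * 8 * 1024  # GB -> Mbit
    return mbits_per_hour / 3600  # Mbit per second


def catchup_hours(backlog_gb: float, link_mbps: float,
                  change_rate_gb_per_hour: float) -> float:
    """Hours to drain a replication backlog, given that new changes
    keep arriving while the link drains the queue."""
    drain_gb_per_hour = link_mbps * 3600 / (8 * 1024)
    surplus = drain_gb_per_hour - change_rate_gb_per_hour
    if surplus <= 0:
        raise ValueError("link cannot keep up with the change rate")
    return backlog_gb / surplus


# Example: 10 GB of change per hour needs roughly a 23 Mbit/s link
# just to stay even; a 100 Mbit/s link drains a 50 GB backlog in
# about an hour and a half while changes continue to arrive.
print(round(min_bandwidth_mbps(10), 2))
print(round(catchup_hours(50, 100, 10), 2))
```

If the surplus in `catchup_hours` is zero or negative, the secondary site falls further behind forever, which is exactly the situation an architect sizing the link wants to rule out.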
If you're an architect, focus first on minimizing the amount of data transmitted while keeping the two sites as closely synchronized as possible, and then on triggering failover in as little time as possible. There are many technologies in this area; they vary considerably, and that variation affects how well you can execute.
Some continuous data protection (CDP)-type replication technologies, such as InMage, excel in exactly these areas: they minimize the data transmitted while keeping the sites closely synchronized.
This was first published in August 2008