Hi Don, I quickly reviewed the replies and did not see many folks mention 'archiving' as an option. Putting aside for a moment the issues regarding the transportation of tapes, etc., what is the actual daily delta of changes on your datastore? How much data is added or changed each day, and how much of what you are backing up is completely static?
The usual culprit in this situation, if it is not database growth, is a ton of multimedia data. Landscape architects make a good example: I had a client, a landscape architecture firm, with a large amount of data on their server (3+ TB); however, the vast majority of it consisted of static project files, specifically image and media files that changed little, if at all, over time. Nearline and offline options should be explored. With this client and a couple of others, I took the following approach:

Stage 1:
1. A 'last modified' sweep of the filesystem was done to determine the actual daily delta of changes.
2. At the same time, a policy decision was made on when projects could be retired.
3. Static libraries of images and multimedia files were identified.

Stage 2:
1. The 'daily full' backup job was modified to a weekend full with daily differentials.
2. An archive job was created to write a permanent tape archive of older data.
3. Static image libraries were pushed to either cheaper storage (mirrored SATA drives on an array) or nearline storage (labelled USB drives on a shelf).

Stage 3:
1. Backup jobs with detailed results were submitted to management for sign-off.
2. The archive process was reviewed and found to be working effectively: old job files were actively scrubbed from the system in accordance with the data lifecycle policy.
3. Static libraries and resources were either archived or backed up once to a WORM tape and a shelf drive, then removed from the regular backup rotation.

With such an approach, you may find that both data management and compliance requirements are eased. However, this is an intensive process requiring many meetings, a high pucker level, and a lot of sign-off when you actually exclude that data from the backup and/or clean it off the fileserver.
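For the Stage 1 sweep, a couple of find(1) one-liners run on the file server will usually tell you everything you need. A minimal sketch, assuming GNU find and awk; `/srv/projects` is a hypothetical datastore root (substitute your own), and the 730-day cutoff is just an example retirement threshold to align with whatever your policy says:

```shell
#!/bin/sh
# 'Last modified' sweep sketch (assumes GNU find/awk).
# DATA is a placeholder path -- point it at your datastore root.
DATA=${DATA:-/srv/projects}

# Daily delta: count and total size of files modified in the last 24 hours.
find "$DATA" -type f -mtime -1 -printf '%s\n' \
  | awk '{s+=$1; n++} END {printf "daily delta: %d files, %.1f MB\n", n, s/1048576}'

# Archive candidates: files untouched for roughly two years (730+ days).
find "$DATA" -type f -mtime +730 -printf '%s\n' \
  | awk '{s+=$1; n++} END {printf "static 2y+:  %d files, %.1f MB\n", n, s/1048576}'
```

Run it off-hours for a week and you have both your real backup delta and a first cut at the static data that can come out of the rotation.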
Really though, in the end it is the only sane approach aside from "expect to double your storage and backup requirements every two to three years," which is what the client was facing. And I also have to admit, as a consultant in the real world, I have had very few clients actually sign off on and carry this process through to the end.

-- Durf

On Wed, Jul 28, 2010 at 8:54 AM, Holstrom, Don <dholst...@nbm.org> wrote:
> I have been backing up all our data to tape drives. A vice president of the
> Museum likes to take a copy home regularly in case our machines blow up...
>
> But now we have nearly two terabytes of data. Tape drives go up to 1.7 T's,
> but I can only find libraries going higher.
>
> What other options do I have, so the VP can still take home a copy of the
> data?
>
> Extra HDs take so much time.
>
> ~ Finally, powerful endpoint security that ISN'T a resource hog! ~
> ~ <http://www.sunbeltsoftware.com/Business/VIPRE-Enterprise/> ~

--
NEW PHONE NUMBER: tel: 617.671.0572
Just state your name and wait to be connected.
--------------
Give a man a fish, and he'll eat for a day. Give a fish a man, and he'll eat for weeks!