> -----Original Message-----
> From: ADSM: Dist Stor Manager [mailto:ads...@vm.marist.edu] On Behalf
> Of Steven Harris
> Sent: Thursday, April 30, 2009 6:23 PM
> To: ADSM-L@VM.MARIST.EDU
> Subject: Re: best backup method for millions of small files?
>
> Hi Norman
>
> Your post worries me, as I'm just implementing an email archive solution
> that will depend on windows journalling to back up some huge repositories.
> The particular product fills up "containers" that once filled never
> change, so the change rate will be low there, but there are also index
> files that will change often.
I'm way behind on my emails, but thought I would respond to this anyway. I have worked with some imaging/archiving solutions in the past that had millions of small files. For the ones that fill containers/directories until full and then never change them again, I normally implement a combination of archives and backups. It can be a somewhat manual process, but it may be preferable to the alternatives when the number of files is extremely large. In every case where I have done this, the recovery time for archived images was often weeks, so restore speed hasn't been a priority.

1. Archive the containers that are full one time to TSM with unlimited retention.
2. Once a container is archived, exclude it from the backup process.
3. Schedule incremental backups every night; they should only back up/scan containers that are new since the last archive run. Most data should be excluded and not scanned.
4. Rerun the archive process once per month, or at whatever interval makes sense given how quickly containers fill up. Only archive full containers that haven't yet been archived, and make sure to add them to the backup excludes once they are successfully archived.

______________________________
John Monahan
Infrastructure Services Consultant
Logicalis, Inc.
5500 Wayzata Blvd Suite 315
Golden Valley, MN 55416
Office: 763-226-2088
Mobile: 952-221-6938
Fax: 763-226-2081
john.mona...@us.logicalis.com
http://www.us.logicalis.com
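P.S. The archive-then-exclude cycle described above could be scripted roughly like this. This is only a sketch under my own assumptions, not part of the original process: containers are taken to be directories directly under a repository root, a container that has been quiet for 30 days is treated as "full," the FOREVER management-class name and all paths are hypothetical, and the generated dsmc commands are printed (dry run) rather than executed.

```python
#!/usr/bin/env python3
"""Sketch of a monthly archive-then-exclude cycle for full containers.

Assumptions (mine, not from the original post): a container is a
directory directly under repo_root; it is "full" once no file in it
has changed for full_after_days; archived containers are remembered
in a plain-text state file; retention is controlled by the archive
copy group bound to a hypothetical "FOREVER" management class.
"""
import time
from pathlib import Path


def is_full(container: Path, full_after_days: int = 30) -> bool:
    """Assume a container untouched for full_after_days is full forever."""
    newest = max((p.stat().st_mtime for p in container.rglob("*")
                  if p.is_file()),
                 default=time.time())  # empty container: never "full"
    return (time.time() - newest) > full_after_days * 86400


def run_cycle(repo_root: Path, state_file: Path, inclexcl: Path,
              dry_run: bool = True) -> list:
    """Return the dsmc archive commands for newly full containers.

    In a real run (dry_run=False) each command would be executed and,
    on success, the container added to the exclude list and state file.
    """
    done = (set(state_file.read_text().splitlines())
            if state_file.exists() else set())
    cmds = []
    for container in sorted(p for p in repo_root.iterdir() if p.is_dir()):
        if str(container) in done or not is_full(container):
            continue
        # Step 1: archive the full container once.
        cmd = (f'dsmc archive "{container}/" -subdir=yes -archmc=FOREVER '
               f'-description="container archive {container.name}"')
        cmds.append(cmd)
        if not dry_run:
            # subprocess.run(shlex.split(cmd), check=True)  # real run
            # Steps 2 and 4: exclude once successfully archived.
            with inclexcl.open("a") as f:
                f.write(f"exclude.dir {container}\n")
            with state_file.open("a") as f:
                f.write(f"{container}\n")
    return cmds
```

Run monthly from cron; containers already in the state file are skipped, so only containers that filled up since the last cycle get archived and excluded.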