Data Domain also has a limit on the amount of data one of their units can track, but it's fairly high. If you need more than 32 TB of deduped data in PureDisk (PD), you do need to deploy a separate PureDisk environment. PD then just splits the hash space into chunks and stores each segment on whichever node its hash maps to. I'm not sure what their upper limit is, but it can scale quite high as well.
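Conceptually, that routing works something like the sketch below. This is illustrative only, not PureDisk's actual code; the node names, the use of MD5 as the fingerprint, and the 32-bit bucket width are all assumptions made for the example:

```python
import hashlib

# Hypothetical sketch of hash-space partitioning across dedup nodes.
# The hash space is divided into equal chunks, one per node; a segment
# is stored on the node whose chunk its fingerprint falls into.

NODES = ["node-a", "node-b", "node-c", "node-d"]

def node_for_segment(segment: bytes) -> str:
    """Map a data segment to a storage node by partitioning the hash space."""
    digest = hashlib.md5(segment).digest()
    # Interpret the first 4 bytes of the fingerprint as an int in [0, 2**32)
    bucket = int.from_bytes(digest[:4], "big")
    # Split the 32-bit hash space into equal-sized chunks, one per node
    chunk_size = 2**32 // len(NODES)
    return NODES[min(bucket // chunk_size, len(NODES) - 1)]
```

The useful property is that identical segments always land on the same node, so duplicates are detected there no matter which client or media server sent them, and adding nodes grows the total hash space the cluster can track.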
On 8/24/10, judy_hinchcli...@administaff.com
<judy_hinchcli...@administaff.com> wrote:
> Remember that the built-in de-dup has a limit on the amount of data it can
> keep track of.
> Over that amount you need to use a de-dup appliance.
>
>
> From: veritas-bu-boun...@mailman.eng.auburn.edu
> [mailto:veritas-bu-boun...@mailman.eng.auburn.edu] On Behalf Of Alley, Chris
> Sent: Tuesday, August 24, 2010 4:09 PM
> To: veritas-bu@mailman.eng.auburn.edu
> Subject: [Veritas-bu] Question for NetBackup 7.0 Dedup user OR Data Domain
> users
>
> We are looking to change our backups to a disk-based deduplication solution,
> and two of our options are to utilize NetBackup 7.0's built-in dedupe
> (client and media server) or to put a Data Domain box in. I wanted to see
> if I could get some real-world feedback on what you guys have been seeing
> in terms of dedupe rates, performance, etc. For example, Data Domain claims
> we would only see about a 5:1 dedupe rate using NetBackup, which seems
> quite a bit lower than what I would expect... and of course they claim they
> would get about 20:1. I realize that all data is different, which is why I
> hope that several people will reply with what they are seeing. Thanks for
> your time!
>
> _______________________________________________
> Veritas-bu maillist - Veritas-bu@mailman.eng.auburn.edu
> http://mailman.eng.auburn.edu/mailman/listinfo/veritas-bu