I have a large but still limited amount of total storage space
available, and am trying to figure out how best to divide it up
between the "primary" NAS and the "BUNAS" that will be hosting
BackupPC's volumes.

The least critical are of course the media files, and in fact they
don't even really need to live on hard disk at all, as I've been
archiving to DVD as I've been collecting/converting. They
**certainly** don't need to take up my valuable NAS space more than
once - in other words, if I do choose to store the more valuable of
these files there, they should not be backed up any further.

So here's my question - is it possible to set up a separate "media
tree" directory within the BackupPC pool's filesystem that is
explicitly intended to be exposed to "regular users" as a network
share (whether via Samba, NFS, or even a "check-in/out" web front
end)?

The goal is to ensure that, to the extent users have copies of these
files on their local filesystems, they won't slow down the backup
sessions, since the data will already exist in the pool. If this
isn't a good idea, perhaps having such a folder tree "off to the
side" from BackupPC's pool, but still in the same filesystem, would
work in conjunction with periodically running one of the
"dedupe/hardlinking" scripts - is any particular one recommended for
use alongside BackupPC's pool?
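
Here's the sort of thing I'm picturing for the "off to the side"
variant - a minimal Python sketch that hashes files under the media
tree and a user-facing share and hardlinks identical content
together. The paths are made-up placeholders, and it deliberately
knows nothing about BackupPC's own pool/cpool layout, so please treat
it as an illustration of the idea rather than something to point at
the real pool:

#!/usr/bin/env python3
"""Rough sketch: hardlink duplicate files between a "media tree" and a
user share. Paths are placeholders; this just dedupes plain files by
content hash and ignores BackupPC's pool layout entirely."""
import hashlib
import os

MEDIA_TREE = "/srv/media"      # the "master" copies
SHARE_TREE = "/srv/share"      # where users drop their own copies

def sha1_of(path, bufsize=1 << 20):
    """Return the SHA-1 hex digest of a file's contents."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(bufsize), b""):
            h.update(chunk)
    return h.hexdigest()

def index_tree(root):
    """Map content hash -> first path seen with that content."""
    index = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            if os.path.isfile(p) and not os.path.islink(p):
                index.setdefault(sha1_of(p), p)
    return index

def dedupe_against(index, root):
    """Replace files under root with hardlinks to identical masters."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            if not os.path.isfile(p) or os.path.islink(p):
                continue
            master = index.get(sha1_of(p))
            if master and not os.path.samefile(master, p):
                tmp = p + ".dedupe-tmp"
                os.link(master, tmp)   # must be the same filesystem
                os.replace(tmp, p)     # swap the hardlink into place
                print("linked %s -> %s" % (p, master))

if __name__ == "__main__":
    dedupe_against(index_tree(MEDIA_TREE), SHARE_TREE)

I realise existing tools (rdfind, the various "hardlink" utilities,
and so on) do this job better than anything I'd roll myself; what I'm
really after is which of them plays nicest alongside a BackupPC pool.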

----------

I'm going to set up and train users to use a top-level "don't back
up this tree" folder on their local filesystems, where they should
place unimportant but humongous files like these. However, as we all
know, users will follow such a scheme inconsistently at best.

Which brings me to my next question - can anyone suggest a way to
alert the sysadmin (me) when a user has (mistakenly) left one of
these ginormous files in a filesystem that does get backed up? I'm
guessing it would be some sort of cron-triggered script looking for
new hardlinks to existing files within my "media tree", ideally
reporting the user and the file's location in their filesystem.
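
To make that concrete, below is a rough Python sketch of what I'm
imagining for the cron job: it records the inodes of everything under
the media tree, then flags any file elsewhere on the same filesystem
that turns out to be hardlinked to one of them. The paths are
placeholders for my setup, and it assumes the users' copies really do
end up pooled/hardlinked next to the media tree - which is exactly
the part I'm not sure about:

#!/usr/bin/env python3
"""Rough cron sketch: flag files outside the media tree that are
hardlinked to files inside it (i.e. a user's copy that landed in the
backed-up area). All paths are placeholders."""
import os

MEDIA_TREE = "/srv/backuppc/media"   # the shared "master" media tree
SCAN_TREE = "/srv/backuppc/pc"       # per-host backup trees to scan

def inode_set(root):
    """Collect (device, inode) pairs for every file under root."""
    inodes = set()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            try:
                st = os.lstat(os.path.join(dirpath, name))
            except OSError:
                continue
            inodes.add((st.st_dev, st.st_ino))
    return inodes

def report_links(media_inodes, root):
    """Print every file under root whose inode matches the media tree."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)
            except OSError:
                continue
            if (st.st_dev, st.st_ino) in media_inodes:
                print("%s (%d MB)" % (path, st.st_size // (1024 * 1024)))

if __name__ == "__main__":
    report_links(inode_set(MEDIA_TREE), SCAN_TREE)

Since the host name shows up in the flagged path anyway, mapping a
hit back to a user shouldn't be too painful. A simple size threshold
on new files in each host's latest backup might be even simpler and
would also catch big files that aren't in the media tree at all -
happy to hear which approach people actually use.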

Pointers to implementation examples of anything similar to this would
be great, or otherwise perhaps some hints to get me started?

Thanks in advance. . .

PS I never got a direct answer to my question about backing
up/transferring BackupPC's filesystem (but thanks to Pavel for
mentioning his RAID method).

=============
If no one's worked with FSArchiver, then how about some feedback on
which of my current top two choices y'all would pick?

A - rsync the pool over, then use BackupPC_tarPCCopy to dump the
hardlinked directory structure to a temporary location and unpack it
on the target. I need to work on my scripting skills anyway 8-(

B - full partition clone via something like dc3dd or Acronis True Image
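
In case it helps anyone comment on A, here's roughly how I picture
scripting it - a rough Python sketch based on my reading of the
"Copying the pool" procedure in the docs. The TopDir, install paths
and tar flags are assumptions for a typical 3.x layout, so please
check them against your own setup (and run it as the backuppc user
with the server stopped):

#!/usr/bin/env python3
"""Rough sketch of option A: rsync the pool across, then rebuild the
hardlinked pc/ tree on the target with BackupPC_tarPCCopy | tar.
Paths follow a typical BackupPC 3.x layout but are assumptions."""
import os
import subprocess

OLD_TOPDIR = "/var/lib/backuppc"        # existing TopDir
NEW_TOPDIR = "/mnt/bunas/backuppc"      # target TopDir on the BUNAS
TARPCCOPY = "/usr/share/backuppc/bin/BackupPC_tarPCCopy"

def copy_pool():
    """Copy everything except pc/ (cpool, pool, conf, log, ...)."""
    subprocess.check_call([
        "rsync", "-aH",
        "--exclude", "/pc",
        OLD_TOPDIR + "/", NEW_TOPDIR + "/",
    ])

def rebuild_pc_tree():
    """Recreate pc/ on the target as hardlinks into the copied pool."""
    new_pc = os.path.join(NEW_TOPDIR, "pc")
    os.makedirs(new_pc, exist_ok=True)
    producer = subprocess.Popen(
        [TARPCCOPY, os.path.join(OLD_TOPDIR, "pc")],
        stdout=subprocess.PIPE)
    consumer = subprocess.Popen(
        ["tar", "-xPf", "-"], stdin=producer.stdout, cwd=new_pc)
    producer.stdout.close()   # let the producer see SIGPIPE if tar dies
    if consumer.wait() or producer.wait():
        raise RuntimeError("BackupPC_tarPCCopy | tar pipeline failed")

if __name__ == "__main__":
    copy_pool()
    rebuild_pc_tree()

Not pretty, but at least it keeps the pool copy and the hardlink
rebuild as two separate, restartable steps.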


Feature request: just MHO, but there really should be an option
(both web and CLI) to copy the whole kahuna over, as long as there's
an appropriate target device mounted. I can't imagine too many people
are dumping this stuff to tape??
