Okay, maybe this is dangerous to try, but I need to resolve this problem,
so here goes...

I have a backup which includes, unfortunately, a directory containing 
233,049 files.  It's a deprecated temporary area that a long-running
process has been dumping files into for ages, and it had been forgotten
about until now.

So... that's an obscene number of files for a single directory to hold
under the best of circumstances, but it seems that BackupPC_Link is getting
really bogged down slogging through this moby huge directory (which is
stored on a FireWire-connected drive, only making that worse).

Since I don't want that data anyway, I'd just as soon cut my losses at 
this point and skip over those files.  If I were to stop BackupPC_Link,
go into /.../pc/$HOST/0 and delete the offending files, then restart
BackupPC_Link, will it get confused by the missing files?  Or should I
also edit NewFileList.0 and remove the offending entries from it?
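
To make that concrete, here's roughly the sequence I'm picturing (the
init script path is just how BackupPC gets started and stopped on my
box, $TopDir stands in for my BackupPC data directory, and the mangled
directory name is a placeholder for the real one):

    # stop the BackupPC daemon so no BackupPC_Link run is in flight
    /etc/init.d/backuppc stop

    # remove the unwanted directory from the new backup's tree
    # (BackupPC mangles file names with an "f" prefix, so the real
    #  path will look something like this placeholder)
    cd $TopDir/pc/$HOST/0
    rm -rf ./fsome-mountpoint/fdeprecated-tmp-area

    # bring the daemon back up
    /etc/init.d/backuppc start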

For that matter, should I just edit NewFileList.0 and remove EVERY line
up to and including the point where it's stuck now?
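
If so, I assume it'd just be a matter of something like this, keeping a
copy of the original and using the last entry the log shows it got
through as the cut point (LAST-PROCESSED-ENTRY below is a placeholder
for that literal entry):

    cp NewFileList.0 NewFileList.0.orig
    # print only the lines after the last entry BackupPC_Link finished,
    # matching it as a literal string so slashes don't need escaping
    awk 'seen; index($0, "LAST-PROCESSED-ENTRY") { seen = 1 }' \
        NewFileList.0.orig > NewFileList.0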

And as long as I'm asking questions: I have multiple mountpoints being
backed up, and NewFileList.0 has an entry for "/attrib" before each
mountpoint's set of entries.  Are those duplicate entries going to
cause a problem?

Thanks!

