On Fri, 2005-10-14 at 08:12, Jonathan Dill wrote:
> >
> >> I've just installed BackupPC to replace a simple script that tarred 
> >> two folders to a remote machine. The script took about 3 hours for 
> >> 40 GB of data. BackupPC has now been running for almost 7 hours (1 AM 
> >> to 7 AM), and hasn't finished yet.

> >> Compression level is set to 3. Is there something else that could be 
> >> affecting it?
> >
> I'm new to BackupPC, but as I understand it, it calculates checksums for 
> files and then compares them against the file pool, so there's got to be 
> some overhead from that at least, especially if it has to decompress 
> pooled files to calculate the checksums.  Subsequent incremental backups 
> will probably take a lot less time, though.

The biggest difference is probably that BackupPC creates a separate
file (actually two hardlinks) for every new file on the source.  With
many small files, the file-creation overhead can exceed the cost of
compression and checksums, depending partly on CPU and disk speed.
Incrementals should run much faster, and subsequent fulls at least
slightly faster.
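
To make the overhead concrete, the pooling scheme amounts to roughly
the following (a simplified Python sketch, not BackupPC's actual Perl
code; the pool path and function names are made up, and the real
checksum scheme is more involved than a plain MD5 of the contents):

import hashlib
import os
import shutil

POOL_DIR = "/var/lib/backuppc/pool"   # hypothetical pool location

def file_digest(path, chunk=1 << 20):
    """Checksum the file contents; decompressing pooled files to
    compare against adds to this cost."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def store(src, backup_dest):
    """Store src in the backup tree, deduplicating against the pool."""
    pool_path = os.path.join(POOL_DIR, file_digest(src))
    if not os.path.exists(pool_path):
        # New content: one file created in the pool (first link)...
        shutil.copy2(src, pool_path)
    # ...plus a hardlink into the backup tree (second link).  This
    # per-file creation cost is what dominates with many small files.
    os.link(pool_path, backup_dest)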

-- 
  Les Mikesell
    [EMAIL PROTECTED]



