Craig Barratt wrote:
> Rich writes:
>
>   
>> I don't think BackupPC will update the pool with the smaller file even
>> though it knows the source was identical, and some tests I just did
>> backing up /tmp seem to agree.  Once compressed and copied into the
>> pool, the file is not updated with future higher compressed copies.
>> Does anyone know something otherwise?
>>     
>
> You're right.
>
> Each file in the pool is only compressed once, at the current
> compression level.  Matching pool files is done by comparing
> uncompressed file contents, not compressed files.
>
> It's done this way because compression is typically a lot more
> expensive than uncompressing.  Changing the compression level
> will only apply to new additions to the pool.
>
> To benchmark compression ratios you could remove all the files
> in the pool between runs, but of course you should only do that
> on a test setup, not a production installation.
>
> Craig
>   
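Craig's point (compress once at add time, match on uncompressed contents) can be illustrated with a toy sketch. This is not BackupPC's actual code; the class, the use of MD5, and the `level` attribute are all illustrative assumptions:

```python
import hashlib
import zlib

class Pool:
    """Toy model of a content-addressed pool: each file is compressed
    once, at the level in effect when it is added; matching is done on
    the uncompressed contents, so re-adding identical data never
    re-compresses it."""

    def __init__(self, level=3):
        self.level = level   # compression level for *new* additions only
        self.store = {}      # digest of uncompressed data -> compressed bytes

    def add(self, data: bytes) -> str:
        # Key on the uncompressed contents, not the compressed form.
        digest = hashlib.md5(data).hexdigest()
        if digest not in self.store:
            # Only content not already in the pool pays the compression cost.
            self.store[digest] = zlib.compress(data, self.level)
        return digest

pool = Pool(level=1)
key = pool.add(b"hello" * 1000)
pool.level = 9                         # raising the level later...
assert pool.add(b"hello" * 1000) == key
# ...matches the existing entry and leaves it compressed at level 1.
```

This mirrors the behavior described above: changing the compression level affects only files added to the pool afterwards.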
The other point to keep in mind is that unless you actually need 
compression to save disk space, leaving it off will often be faster 
on a CPU-bound server.  Since a script is provided 
(BackupPC_compressPool) to compress the pool later, you can safely 
leave compression off until you need the disk space.
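The "compress later" approach can be sketched the same way: store new files uncompressed (no CPU cost at backup time), then compress the whole pool in one pass when space runs low. Again a toy illustration, not what BackupPC_compressPool actually does internally; all names here are made up:

```python
import hashlib
import zlib

# digest of uncompressed data -> (is_compressed, bytes)
store = {}

def add(data: bytes) -> str:
    """Add a file to the pool without compressing it (fast path)."""
    digest = hashlib.md5(data).hexdigest()
    store.setdefault(digest, (False, data))
    return digest

def compress_pool(level=3):
    """One-off pass that compresses every still-uncompressed entry,
    analogous in spirit to running BackupPC_compressPool later."""
    for digest, (compressed, blob) in list(store.items()):
        if not compressed:
            store[digest] = (True, zlib.compress(blob, level))

key = add(b"log line\n" * 500)
compress_pool()
assert store[key][0] and zlib.decompress(store[key][1]) == b"log line\n" * 500
```

The trade-off is the one John describes: backups spend no CPU on compression, at the cost of more disk usage until the one-off pass is run.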

John

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
