Re: [BackupPC-users] High Backup Data Volumes After Re-adding an Excluded Directory

2010-07-19 Thread Josh Malone
On Sun, 18 Jul 2010 10:24:08 -0400, "Norbert Hoeller" wrote:
> While trying to diagnose the high backuppc data volumes issue posted to
> the mailing list on June 14th, I had excluded a directory structure
> containing about 140MB of data. I removed the exclude once Craig had
> provided a fix for File::RsyncP and noticed that backup volumes jumped by
> about 150MB. Tracing suggested that all the files in the previously
> excluded directory structure were being backed up on every incremental
> backup, even though the content of the files was unchanged (the first
> incremental backup after the directory was added indicated that backuppc
> had found the file in the backup pool). 
> 
> Although the contents of the files had not changed, I had 'touch'ed the
> files during the period when the directory structure had been excluded
> so that Google Sitemap would index them. It seems that the backuppc
> incremental backup got confused and repeatedly selected the files for
> backup even though the file date was no longer changing.
> 
> File::RsyncP/rsync should have determined that the contents of the files
> were identical to the pool copy. Verbose logging suggests that checksums
> were exchanged, but rsync did nothing with them (the remote system
> reported false_alarms=0 hash_hits=0 matches=0). The reason is not clear.
> I had enabled checksum caching at one point, but disabling checksum
> caching did not change the symptoms.
> 
> The problem was 'fixed' by doing a full backup. It appears that this
> caused rsync to properly compare checksums and backuppc updated the file
> date - the next incremental backup did not check the files that
> previously had been copied in full. I 'touch'ed one of the files and
> verified that the next incremental backup checked the file but rsync
> found no changed blocks.

During incremental backups, BackupPC will back up any files not already in
the previous full, IIRC. Also, unless you change it in the config, backup
dump levels don't increment on each successive incremental; each
incremental is a "level 1", meaning "back up all files changed since the
previous full".

You can set "IncrLevels" in the config to "1, 2, 3, 4, 5, 6, 7" or similar
to change this behaviour.
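
For example, a sketch against a BackupPC 3.x config.pl (the exact levels here
are just an illustration; pick whatever suits your schedule):

    # Sketch only: with increasing levels, each incremental backs up what has
    # changed since the most recent backup of a lower level, rather than
    # always diffing against the last full.
    $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6, 7];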

-Josh


-- 

   Joshua Malone         Systems Administrator
 (jmal...@nrao.edu)      NRAO Charlottesville
 434-296-0263            www.cv.nrao.edu
434-249-5699 (mobile)
BOFH excuse #81:

CPU-angle has to be adjusted because of vibrations
coming from the nearby road




[BackupPC-users] High Backup Data Volumes After Re-adding an Excluded Directory

2010-07-18 Thread Norbert Hoeller
While trying to diagnose the high backuppc data volumes issue posted to 
the mailing list on June 14th, I had excluded a directory structure 
containing about 140MB of data.  I removed the exclude once Craig had 
provided a fix for File::RsyncP and noticed that backup volumes jumped by 
about 150MB. Tracing suggested that all the files in the previously 
excluded directory structure were being backed up on every incremental 
backup, even though the content of the files was unchanged (the first 
incremental backup after the directory was added indicated that backuppc 
had found the file in the backup pool).
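
For reference, the exclude was set the usual way per share in config.pl, along
these lines (the path below is a placeholder, not the actual directory):

    # Placeholder path for illustration only.
    $Conf{BackupFilesExclude} = {
        '/home' => ['/home/www/sitemap-data'],
    };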

Although the contents of the files had not changed, I had 'touch'ed the
files during the period when the directory structure had been excluded
so that Google Sitemap would index them.  It seems that the backuppc
incremental backup got confused and repeatedly selected the files for
backup even though the file date was no longer changing.

File::RsyncP/rsync should have determined that the contents of the files 
were identical to the pool copy.  Verbose logging suggests that checksums 
were exchanged, but rsync did nothing with them (the remote system 
reported false_alarms=0 hash_hits=0 matches=0).  The reason is not clear. 
I had enabled checksum caching at one point, but disabling checksum caching
did not change the symptoms.
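
As a rough conceptual sketch of what those counters describe (this is not
File::RsyncP's or rsync's actual code): the server side sends per-block
checksums of its copy, the client computes checksums over its own copy, and
each agreeing block counts as a match, so an unchanged file should normally
report matches equal to the block count and transfer essentially no data:

    # Conceptual sketch only -- real rsync pairs a rolling weak checksum with
    # a strong (MD4/MD5) per-block checksum; only a strong digest is shown here.
    use strict;
    use warnings;
    use Digest::MD5 qw(md5_hex);

    my $block_size = 2048;

    sub block_sums {
        my ($data) = @_;
        my @sums;
        for (my $off = 0; $off < length $data; $off += $block_size) {
            push @sums, md5_hex(substr($data, $off, $block_size));
        }
        return @sums;
    }

    my $pool_copy   = 'x' x 10_000;   # stand-in for the unchanged pool file
    my $client_copy = 'x' x 10_000;   # identical content, only mtime differs

    my @remote = block_sums($pool_copy);    # checksums sent from the server side
    my @local  = block_sums($client_copy);  # checksums computed on the client

    my $matches = grep { defined $remote[$_] && $local[$_] eq $remote[$_] } 0 .. $#local;
    print "matches=$matches of ", scalar @local, " blocks\n";  # expect every block to match

In my case matches stayed at 0 even though the content was identical, which is
why the incremental kept copying the files in full.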

The problem was 'fixed' by doing a full backup.  It appears that this 
caused rsync to properly compare checksums and backuppc updated the file 
date - the next incremental backup did not check the files that previously 
had been copied in full.  I 'touch'ed one of the files and verified that 
the next incremental backup checked the file but rsync found no changed 
blocks.