Re: [BackupPC-users] Backup only new file(s)

2009-07-01 Thread Mirco Piccin
Hi all,

 Every day (except sunday) a procedure stores in this folder a 120GB file.
 The name of the file is the day name.

 So, in a week, i have 6 different files generated (about 720 GB).
 Every week the files are overwritten by the procedure.

 I'd like to backup only the newest file, and not all the folder.

I've finally solved my problem, thanks to all (in particular to Les & Jeffrey).
The method is not perfect, but it works :-D.
Here is the host .pl file:

$Conf{XferMethod} = 'rsync';
$Conf{ClientNameAlias} = '127.0.0.1';
$Conf{FullPeriod} = '0.97';
$Conf{RsyncShareName} = [
  '/media/smbfs'
];
$Conf{BackupFilesOnly} = {
  '*' => [
'/additional_path/filename.extension'
  ]
};
$Conf{RsyncClientCmd} = '$rsyncPath $argList+';
$Conf{RsyncClientRestoreCmd} = '$rsyncPath $argList+';
$Conf{DumpPreUserCmd} = '/bin/sh /usr/src/predump.sh
//10.0.5.3/tempshare $share /etc/backuppc/.smb_credentials
additional_path filename.extension';
$Conf{DumpPostUserCmd} = '/bin/sh /usr/src/postdump.sh $share
additional_path filename.extension';
$Conf{UserCmdCheckStatus} = '1';
$Conf{RestorePreUserCmd} = '/bin/sh /usr/src/mount_smbfs.sh
//10.0.5.3/tempshare $share /etc/backuppc/.smb_credentials';
$Conf{RestorePostUserCmd} = '/bin/sh /usr/src/umount_smbfs.sh $share';

(Compared to the original host .pl file, I've only changed:
additional_path : the directory in the SMB share where the file to
back up is stored;
filename.extension : the chosen name for the file (the name
that will appear when browsing the backup)
)

I've changed RsyncClientCmd and RsyncClientRestoreCmd (to make rsync work locally).
For DumpPreUserCmd I've created a script; that script runs another 2 scripts:
- one to mount the remote share at the defined mount point ($share)
(also used in RestorePreUserCmd);
- one to rename the last-modified file found under the mount point +
additional_path to filename.extension; the old file name is saved as an
empty .pid file (which also helps me verify whether a backup is running).

Then the rsync backup starts.
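The rename step inside predump.sh might look roughly like this. This is a sketch under assumptions: the function and file names are invented, and the real script first mounts //10.0.5.3/tempshare on $share via sudo smbmount.

```shell
#!/bin/sh
# Sketch of the "rename newest file" step from predump.sh (names invented).
# The real script mounts the SMB share before this runs.

rename_newest() {
    dir="$1"      # e.g. $share/additional_path
    target="$2"   # e.g. filename.extension

    # Most recently modified file in the directory.
    newest=$(ls -t "$dir" | head -n 1)
    [ -n "$newest" ] || return 1

    # Keep the original name as an empty .pid marker; it doubles as a
    # "backup in progress" flag.
    touch "$dir/$newest.pid"
    mv "$dir/$newest" "$dir/$target"
}

# Demo on a throwaway directory.
demo=$(mktemp -d)
touch -t 202001010000 "$demo/monday"
touch -t 202001020000 "$demo/tuesday"    # tuesday is the newest
rename_newest "$demo" filename.extension
```

After this runs, tuesday has become filename.extension and an empty tuesday.pid records the original name.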

I've then created a script for DumpPostUserCmd; that script runs
another 2 scripts:
- one to rename filename.extension back to the original file name and
remove the .pid file;
- one to umount the remote Samba share (also used in RestorePostUserCmd).
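The matching restore step in postdump.sh could be sketched like this (again, names are invented; the real script also umounts the share afterwards via sudo smbumount):

```shell
#!/bin/sh
# Sketch of the "restore original name" step from postdump.sh (names invented).

restore_name() {
    dir="$1"; target="$2"

    # The .pid marker's basename is the original file name.
    marker=$(ls "$dir"/*.pid 2>/dev/null | head -n 1)
    [ -n "$marker" ] || return 1
    orig=$(basename "$marker" .pid)

    mv "$dir/$target" "$dir/$orig"
    rm -f "$marker"
}

# Demo: undo the state left by the pre-dump rename.
demo=$(mktemp -d)
touch "$demo/filename.extension" "$demo/tuesday.pid"
restore_name "$demo" filename.extension
```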

For smbmount/smbumount I've configured sudo for the backuppc user.

This way, the file name is always the same, and that file is always
the most recently created (or edited) one.
If it helps, I can pastebin the scripts I created.

Regards
M

--
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Les Mikesell
Mirco Piccin wrote:
 Hi all,
 i have to backup a folder (using smb).
 
 Every day (except sunday) a procedure stores in this folder a 120GB file.
 The name of the file is the day name.
 
 So, in a week, i have 6 different files generated (about 720 GB).
 Every week the files are overwritten by the procedure.
 
 I'd like to backup only the newest file, and not all the folder.
 The problem is that i suppose i must have a full backup of the folder
 (720 GB), because $Conf{FullKeepCnt} must be >= 1, plus
 incremental backup.
 So, configuring:
 $Conf{FullPeriod} = 6.97;
 $Conf{IncrKeepCnt} = 6;
 
 i'll have :
 on sunday the full backup - 720 GB
 on monday the incremental backup  - 720 GB (the full backup) plus 120
 GB (the new monday file)
 on tuesday the incremental backup  - 840 GB (the full backup plus
 incremental) plus 120 GB (the new tuesday file)
 
 and so on, for a total of 1440 GB (the double of the effective disk
 space needed).
 
 And again, sunday BackupPC will move 720 GB of files, and so on.
 
 Is there a way to backup only the new file (maybe playing with
 $Conf{IncrLevels}), without a full?
 Or a way to optimize it?

I don't think there is a good way to handle this in BackupPC.  Can you 
change the procedure so the current daily file is created in a directory 
by itself and older ones rotated to a different directory?  Then you 
could do a full of the directory holding the current file every day and 
store as many as you want.
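The rotation Les describes could be sketched like this (paths are invented for illustration):

```shell
#!/bin/sh
# Sketch: keep only today's dump in current/ so a daily full of that
# directory only ever sees one 120 GB file; older dumps go to older/.
base=$(mktemp -d)            # stands in for the real share root
mkdir -p "$base/current" "$base/older"

touch "$base/current/monday"           # pretend: yesterday's dump
mv "$base/current/"* "$base/older/"    # rotate before today's dump arrives
touch "$base/current/tuesday"          # today's dump
```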

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Omar Llorens Crespo Domínguez
Mirco Piccin wrote:
 Hi all,
 i have to backup a folder (using smb).

 Every day (except sunday) a procedure stores in this folder a 120GB file.
 The name of the file is the day name.

 So, in a week, i have 6 different files generated (about 720 GB).
 Every week the files are overwritten by the procedure.

 I'd like to backup only the newest file, and not all the folder.
 The problem is that i suppose i must have a full backup of the folder
 (720 GB), because $Conf{FullKeepCnt} must be >= 1, plus
 incremental backup.
 So, configuring:
 $Conf{FullPeriod} = 6.97;
 $Conf{IncrKeepCnt} = 6;

 i'll have :
 on sunday the full backup - 720 GB
 on monday the incremental backup  - 720 GB (the full backup) plus 120
 GB (the new monday file)
 on tuesday the incremental backup  - 840 GB (the full backup plus
 incremental) plus 120 GB (the new tuesday file)

 and so on, for a total of 1440 GB (the double of the effective disk
 space needed).

 And again, sunday BackupPC will move 720 GB of files, and so on.

 Is there a way to backup only the new file (maybe playing with
 $Conf{IncrLevels}), without a full?
 Or a way to optimize it?

 Thanks
 Regards
 M

   

Hi,

I think it's better to change your XferMethod to rsyncd; rsyncd 
only copies the new files.
You can also set $Conf{FullKeepCnt} = 1, since you only need the 
last copy, and $Conf{IncrKeepCnt} = 1 or 2.

-- 


Omar Llorens Crespo Domínguez.
JPL TSOLUCIO, SL
o...@tsolucio.com
www.tsolucio.com
www.bearnas.com
902 88 69 38





Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Mirco Piccin
Hi and thanks for the reply.

 Every day (except sunday) a procedure stores in this folder a 120GB file.
 The name of the file is the day name.

 So, in a week, i have 6 different files generated (about 720 GB).
 Every week the files are overwritten by the procedure.
...
 i'll have :
 on sunday the full backup - 720 GB
 on monday the incremental backup  - 720 GB (the full backup) plus 120
 GB (the new monday file)
 on tuesday the incremental backup  - 840 GB (the full backup plus
 incremental) plus 120 GB (the new tuesday file)

 and so on, for a total of 1440 GB (the double of the effective disk
 space needed).
...

 I don't think there is a good way to handle this in backuppc.  Can you
 change the procedure so the current daily file is created in a directory
 by itself and older ones rotated to a different directory?  Then you
 could do a full of the one holding the current file every day and store
 as many as you want.


Maybe that's the best solution, but that procedure is not open source...
I could, however, write an additional script (and schedule it) to do the job.

 I think is better that you changer your xfermetoth to rsyncd

This is also a good solution, but I'd like to keep the backup
agentless.

Thanks
M



Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Jeffrey J. Kosowsky
Les Mikesell wrote at about 07:57:11 -0500 on Thursday, June 11, 2009:
  Mirco Piccin wrote:
   Hi all,
   i have to backup a folder (using smb).
   
   Every day (except sunday) a procedure stores in this folder a 120GB file.
   The name of the file is the day name.
   
   So, in a week, i have 6 different files generated (about 720 GB).
   Every week the files are overwritten by the procedure.
   
   I'd like to backup only the newest file, and not all the folder.
   The problem is that i suppose i must have a full backup of the folder
   (720 GB), because $Conf{FullKeepCnt} must be >= 1, plus
   incremental backup.
   So, configuring:
   $Conf{FullPeriod} = 6.97;
   $Conf{IncrKeepCnt} = 6;
   
   i'll have :
   on sunday the full backup - 720 GB
   on monday the incremental backup  - 720 GB (the full backup) plus 120
   GB (the new monday file)
   on tuesday the incremental backup  - 840 GB (the full backup plus
   incremental) plus 120 GB (the new tuesday file)
   
   and so on, for a total of 1440 GB (the double of the effective disk
   space needed).
   
   And again, sunday BackupPC will move 720 GB of files, and so on.
   
   Is there a way to backup only the new file (maybe playing with
   $Conf{IncrLevels}), without a full?
   Or a way to optimize it?
  
  I don't think there is a good way to handle this in backuppc.  Can you 
  change the procedure so the current daily file is created in a directory 
  by itself and older ones rotated to a different directory?  Then you 
  could do a full of the one holding the current file every day and store 
  as many as you want.
  

Couldn't you just do daily full backups (with no incrementals) while
setting $Conf{FullKeepCnt} = 1? Then, as long as you made sure that
BackupPC_nightly didn't run in the middle, you would effectively just
be adding one new backup to the pool each day, and later, when
BackupPC_nightly runs, you would be erasing the entry from 8 days
earlier, so you would never have more than 720+120 = 840 GB in the
pool. This wouldn't be particularly bandwidth-efficient, since you
are always doing fulls rather than incrementals, but it would work...
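In the host's .pl file that suggestion would look roughly like this (a sketch; the exact values are assumptions):

```perl
$Conf{FullPeriod}  = 0.97;    # run a full (almost) every day
$Conf{FullKeepCnt} = 1;       # keep only the most recent full
$Conf{IncrPeriod}  = 1000;    # effectively disable incrementals
```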

However, if you really are only trying to back up a single new 120 GB
file every day, I wonder whether you might be better off just using a
daily 'rsync' cron job. It seems like that would be simpler, more
reliable, and more efficient.
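Such a cron job might be as simple as the following crontab fragment (host, user, and paths are purely illustrative):

```shell
# m h dom mon dow user  command
# Pull the newest dump from the share every night at 02:00.
0 2 * * * backuppc rsync -av /media/smbfs/additional_path/ /backup/dumps/
```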

Also, is each daily file completely distinct from the previous one, or
is it just incrementally changed? If it is only incrementally
changed, you may want to rsync against the previous day's backup first
to reduce network bandwidth.



Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Mirco Piccin
Hi, thanks for reply.

    Every day (except sunday) a procedure stores in this folder a 120GB file.
    The name of the file is the day name.
   
    So, in a week, i have 6 different files generated (about 720 GB).
    Every week the files are overwritten by the procedure.
   
    I'd like to backup only the newest file, and not all the folder.
    The problem is that i suppose i must have a full backup of the folder
    (720 GB), because $Conf{FullKeepCnt} must be >= 1, plus
    incremental backup.
...
    and so on, for a total of 1440 GB (the double of the effective disk
    space needed).
...
 Couldn't you just do daily full backups (with no incrementals) while
 setting $Conf{FullKeepCnt}=1. Then as long as you made sure that
 BackupPC_nightly didn't run in the middle, you would effectively just
 be adding one new backup to the pool each day and later when
 BackupPC_nightly runs you would be erasing the entry from 8 days
 earlier, so you would never have more than 720+120=840 GB in the
 pool. Now this wouldn't be particularly bandwidth efficient since you
 are always doing full rather than incrementals, but it would work...

 However, if you really are only trying to backup a single new 120GB
 file every day, I wonder whether you might be better off just using a
 daily 'rsync' cron job. It seems like that would be simpler, more
 reliable, and more efficient.

 Also, is each daily file completely distinct from the previous one or
 is just incrementally changed? Because if it is just incrementally
 changed you may want to first rsync against the previous day's backup
 to reduce network bandwidth.

My BackupPC server runs on a VIA processor; max throughput is less than 5 MB/s :-(
So backing up 840 GB each time is not the best solution...
(this is the reason I did not configure the backup as you suggested)

Anyway, each daily file is quite similar to the previous one, so rsync
(or a custom script) should be the better way to do the job.

Regards
M



Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Les Mikesell
Mirco Piccin wrote:
 
 Also, is each daily file completely distinct from the previous one or
 is just incrementally changed? Because if it is just incrementally
 changed you may want to first rsync against the previous day's backup
 to reduce network bandwidth.
 
 My BackupPC is running on a VIA processor, max MB/s : less than 5 :-(
 So, backup 840 GB each time is not the best solution ...
 (this is the reason i did not configure the backup as you suggest)
 
 Anyway, each daily file is quite similar to the each other, so rsync
 (or custom script) should be the better way to to the job.

That won't help unless each file is named the same as the previous one. 
Perhaps you could smb-mount the share into the BackupPC server and move 
the files around so you always have the newest file under the same name 
in a subdirectory of the share for the duration of the backup; then you 
could put it back if you want. That would let you use the 'some number 
of fulls only' approach I suggested earlier and also transfer less data 
(but the rsync CPU vs. network tradeoff may be a wash).

If you don't use some approach to just get one file in the directory per 
day, you will probably run out of space on your 2nd full when you 
transfer the current week's files before the previous full can be 
deleted.  Or are you doing this already?

-- 
   Les Mikesell
lesmikes...@gmail.com




Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Filipe Brandenburger
Hi,

On Thu, Jun 11, 2009 at 14:13, Les Mikeselllesmikes...@gmail.com wrote:
 Mirco Piccin wrote:
 Anyway, each daily file is quite similar to the each other, so rsync
 (or custom script) should be the better way to to the job.

 That won't help unless each file is named the same as the previous one.

You can try to use the -y or --fuzzy option to rsync (at least
rsync 3) to implement this.

Quoting from the man page: "-y, --fuzzy: This option tells rsync that
it should look for a basis file for any destination file that is
missing. The current algorithm looks in the same directory as the
destination file for either a file that has an identical size and
modified-time, or a similarly-named file. If found, rsync uses the
fuzzy basis file to try to speed up the transfer."
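An illustrative invocation (host and paths invented): with yesterday's similarly-named file already sitting in the destination directory, --fuzzy lets rsync use it as a delta basis for today's file even though the names differ:

```shell
rsync -av --fuzzy backuppc@10.0.5.3:/dumps/wednesday /backup/dumps/
```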

HTH,
Filipe



Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Adam Goryachev

Filipe Brandenburger wrote:
 Hi,
 
 On Thu, Jun 11, 2009 at 14:13, Les Mikeselllesmikes...@gmail.com wrote:
 Mirco Piccin wrote:
 Anyway, each daily file is quite similar to the each other, so rsync
 (or custom script) should be the better way to to the job.
 That won't help unless each file is named the same as the previous one.
 
 You can try to use the -y or --fuzzy option to rsync (at least
 rsync 3) to implement this.
 
 Quoting from the man page: -y, --fuzzy: This option tells rsync that
 it should look for a basis file for any destination file that  is
 missing. The current algorithm looks in the same directory as the
 destination file for either a file that has an identical size and
 modified-time, or a similarly-named file. If found, rsync uses the
 fuzzy basis file to try to speed up the transfer.

I'm assuming this doesn't help with BackupPC because of the whole Perl
module thing? It would be interesting to see how fuzzy the filenames
can be:

20090601_Master_Backup.tar
20090602_Master_Backup.tar

or

Master_Backup_etc_0001.blah
Master_Backup_etc_0002.blah

or

Master Backup Tue Jun 10 2009.blah
Master Backup Wed Jun 11 2009.blah

The first two are only one character different (and at most 5 or 6
characters different, e.g. 20091231 vs. 20100101); the last one, however,
is much fuzzier...

It would be fantastic if BackupPC were able to deal with this too!

Regards,
Adam

--
Adam Goryachev
Website Managers
www.websitemanagers.com.au



Re: [BackupPC-users] Backup only new file(s)

2009-06-11 Thread Adam Goryachev

Adam Goryachev wrote:
 Filipe Brandenburger wrote:
 Hi,

 On Thu, Jun 11, 2009 at 14:13, Les Mikeselllesmikes...@gmail.com wrote:
 Mirco Piccin wrote:
 Anyway, each daily file is quite similar to the each other, so rsync
 (or custom script) should be the better way to to the job.
 That won't help unless each file is named the same as the previous one.
 You can try to use the -y or --fuzzy option to rsync (at least
 rsync 3) to implement this.

 Quoting from the man page: -y, --fuzzy: This option tells rsync that
 it should look for a basis file for any destination file that  is
 missing. The current algorithm looks in the same directory as the
 destination file for either a file that has an identical size and
 modified-time, or a similarly-named file. If found, rsync uses the
 fuzzy basis file to try to speed up the transfer.

 I'm assuming this doesn't help with backuppc because of the whole perl
 module thing ? It would be interesting to see how fuzzy the filenames
 can be?

BTW, what is the possibility of having BackupPC request just the first 100k
(or however much is needed) of a file to calculate the pool checksum, then
check whether the file already exists in the pool? Then we wouldn't need to
re-download (for example) the 30M Linux kernel package, or the 300M Windows
SP3 file, etc. This would also solve the issue of renamed
files/folders... Can the rsync protocol handle sending just the first
portion of a file?

Of course the full-file checksum would need to match as well :) Probably,
if we find a match in the pool, we then restart the transfer with this
target file, so that we run the checksum over the entire file...

Regards,
Adam

--
Adam Goryachev
Website Managers
www.websitemanagers.com.au
