[BackupPC-users] Arguments for --filter don't get proper escaping

2009-01-29 Thread Thomas Karcher
Hi there,

I'm thrilled about this just-discovered feature of rsync: --filter

Now I want to use it in BackupPC, so I added this to $Conf{RsyncArgs}:

'--filter=:- /nobackup.txt'
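
For reference, a minimal sketch of how such an entry might sit in config.pl -
the surrounding options are inferred from the rsync command line quoted below
and will differ per setup:

    $Conf{RsyncArgs} = [
        # ... existing options, for example:
        '--numeric-ids', '--perms', '--owner', '--group', '-D',
        '--links', '--hard-links', '--times',
        '--block-size=2048', '--recursive', '--one-file-system',
        # the new entry, kept as a single list element:
        '--filter=:- /nobackup.txt',
    ];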

My problem is that there seems to be something wrong with the escaping
of the space ... I observed that with an rsyncd as backup partner,
everything seems to work fine, but with a command line rsync, it's odd:

Running: /usr/bin/sudo /usr/bin/rsync --server --sender --numeric-ids
--perms --owner --group -D --links --hard-links --times
--block-size=2048 --recursive --checksum-seed=32761 --one-file-system
--filter=:-\\\ /nobackup.txt . /

Adding backslashes or quotes just worsens the situation and I get even
more escaping backslashes ... any ideas?


Thank you,
Thomas






Re: [BackupPC-users] Arguments for --filter don't get proper escaping

2009-01-29 Thread Juergen Harms
A nice workaround to this backslashing problem is to place your filter 
statements into a file - say /home/user/rsync_filters - and add a 
corresponding statement to your arguments:
--filter=.\ /home/user/.usync/filters

If you later want to add additional filter statements to your command
line, it will probably blow up even if you become a backslash acrobat;
a filter file makes things much easier.
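
A hedged sketch of what such a filter file might contain - the file name
follows the /home/user/rsync_filters example above, and the patterns are
purely illustrative ('-' lines exclude, '+' lines include):

    - /tmp/
    - /var/cache/
    - *.iso

and the single $Conf{RsyncArgs} entry that pulls the file in once per run
(whether the space before the path needs escaping depends on the transport,
as discussed further down the thread):

    '--filter=. /home/user/rsync_filters'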



Re: [BackupPC-users] DumpPreUserCmd to send an email before and after backup

2009-01-29 Thread Nils Breunese (Lemonbit)
Regis wrote:

 I would like BackupPC to send an email to $host and/or $user, to notify
 them when a full backup is about to start.

 I know it's done via DumpPreUserCmd, but can anyone help me write it?

You can use the mail command. Something like this:

/bin/echo "This is the message" | /bin/mail -s "This is the subject" to_address

Nils Breunese.



Re: [BackupPC-users] Arguments for --filter don't get proper escaping

2009-01-29 Thread Holger Parplies
Hi,

Juergen Harms wrote on 2009-01-29 10:41:22 +0100 [Re: [BackupPC-users] 
Arguments for --filter don't get proper escaping]:
 Thomas Karcher wrote on 2009-01-29 09:59:51 +0100 [[BackupPC-users] Arguments 
 for --filter don't get proper escaping]:
  
  I'm thrilled about this just-discovered feature of rsync: --filter
  
  Now I want to use it in BackupPC, so I added this to $Conf{RsyncArgs}:
  
  '--filter=:- /nobackup.txt'
  
  My problem is that there seems to be something wrong with the escaping
  of the space ...
 
 A nice workaround to this backslashing problem is to place your filter 
 statements into a file - say /home/user/rsync_filters - and add a 
 corresponding statement to your arguments:
   --filter=.\ /home/user/.usync/filters

Hey, great idea - except that it doesn't solve anything. In what way, do you
suppose, is your space character different from the one above (other than
being escaped, making things worse)? On a side note, your argument does not
match the file you are suggesting.

 If you later want to add additional filter statements to your 
 command-line, your command line will probably blow up even if you become 
 a backslash acrobat, a filter file makes things much easier

That part is true. Just keep in mind that you then have a part of your backup
configuration that is not stored with the rest of the BackupPC configuration,
but rather on the backup client somewhere.

  I observed that with an rsyncd as backup partner,
  everything seems to work fine, but with a command line rsync, it's odd:
  
  Running: /usr/bin/sudo /usr/bin/rsync --server --sender --numeric-ids
  --perms --owner --group -D --links --hard-links --times
  --block-size=2048 --recursive --checksum-seed=32761 --one-file-system
  --filter=:-\\\ /nobackup.txt . /
  
  Adding backslashes or quotes just worsens the situation and I get even
  more escaping backslashes ... any ideas?

Actually, you need to use '$argList' instead of '$argList+' in
$Conf{RsyncClientCmd} (and $Conf{RsyncClientRestoreCmd}). You are not passing
the arguments through a shell (as you are when using 'ssh'), so you don't want
any escaping. That also means you should remove the backslash you apparently
have in there now - contrary to your statement above. The entry in RsyncArgs
should read '--filter=:- /nobackup.txt'. It is passed as one argument to
'sudo', which will pass it on as one argument to 'rsync', I believe. Quite
disappointing to backslash acrobats, really.
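
To make that concrete, a sketch under the assumption that the client command
looks like the 'Running:' line above implies (a sudo-based local rsync; your
actual entries may differ):

    # escaped argument list - only wanted when the arguments pass through a shell:
    # $Conf{RsyncClientCmd} = '/usr/bin/sudo $rsyncPath $argList+';

    # unescaped argument list - wanted here, since sudo/rsync run without a shell
    # (command form inferred from the 'Running:' line; adjust to your setup):
    $Conf{RsyncClientCmd}        = '/usr/bin/sudo $rsyncPath $argList';
    $Conf{RsyncClientRestoreCmd} = '/usr/bin/sudo $rsyncPath $argList';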

Regards,
Holger



Re: [BackupPC-users] DumpPreUserCmd to send an email before and after backup

2009-01-29 Thread Les Stott
Madcha wrote:
 Thanks,

 But how do I write this command to send an email to $host or $user?

Here is one way to do it - maybe not the most elegant, but it works. Use
it as a guide and improve on it. I set this up ages ago for a client and
haven't tweaked it since.

These scripts send a note beforehand to say the backup is starting, and
another note afterwards when it completes, whether successful or not. They
also log to the host's BackupPC log file; if there is an error, they log
that as well.

You need to create 4 files. The sample scripts below assume you put them
in /usr/local/bin:

startbkpemail.txt  - body text for the email sent before the backup
startbkpemail.sh   - script that sends the email before the backup
endbkpemail.txt    - body text for the email sent after the backup
endbkpemail.sh     - script that sends the email after the backup

Modify the variables at the top of the scripts to suit your setup.

Call them in BackupPC like so:

$Conf{DumpPreUserCmd}  = '/usr/local/bin/startbkpemail.sh $user $xferOK $host $type $cmdType';
$Conf{DumpPostUserCmd} = '/usr/local/bin/endbkpemail.sh $user $xferOK $host $type $cmdType';

The files should contain something like:

startbkpemail.txt
-
THIS IS AN AUTOMATED SYSTEM MESSAGE!!
PLEASE DO NOT REPLY!!

Your Computer is being backed up by the
Automated System Backup Server.

During this time you may experience some slowness on
the network. This is usually only for a short period.

You will receive a confirmation E-mail when the
Backup Completes.
-

startbkpemail.sh
-
#!/bin/sh
#
#$Conf{DumpPreUserCmd}  = '/usr/local/bin/startbkpemail.sh $user $xferOK $host $type $cmdType';
#$Conf{DumpPostUserCmd} = '/usr/local/bin/endbkpemail.sh $user $xferOK $host $type $cmdType';

varDate=`date '+%F %T'`
DOMAIN=yourdomain.com.au
BACKUPPCPATH=/path/to/data/directory
BACKUPPCLOG=$BACKUPPCPATH/log/LOG
NOTIFY=$...@${DOMAIN}

/bin/mail -s "A Backup of $3 has started..." ${NOTIFY} < /usr/local/bin/startbkpemail.txt
/bin/echo $varDate 'Backup Started E-mail sent to: (' $1 ')' >> $BACKUPPCLOG
-

endbkpemail.txt
-
THIS IS AN AUTOMATED SYSTEM MESSAGE!!
PLEASE DO NOT REPLY!!

Your Computer has been backed up by the
Automated System Backup Server.
-

endbkpemail.sh
-
#!/bin/sh
#
#$Conf{DumpPreUserCmd}  = '/usr/local/bin/startbkpemail.sh $user $xferOK $host $type $cmdType';
#$Conf{DumpPostUserCmd} = '/usr/local/bin/endbkpemail.sh $user $xferOK $host $type $cmdType';

varDate=`date '+%F %T'`
DOMAIN=yourdomain.com.au
BACKUPPCPATH=/path/to/data/directory
BACKUPPCLOG=$BACKUPPCPATH/log/LOG
NOTIFY=$...@${DOMAIN}

### If $2 is a 1, backup completed.

if [ "$2" = 1 ]
then
   /bin/mail -s "Backup of $3 has Completed Successfully." ${NOTIFY} < /usr/local/bin/endbkpemail.txt
   /bin/echo $varDate 'Backup Completed E-mail sent to: (' $1 ')' >> $BACKUPPCLOG
else
   /bin/mail -s "Backup of $3 has Failed. Contact your System Admin." ${NOTIFY} < /usr/local/bin/endbkpemail.txt
   /bin/echo $varDate 'Exit code: (' $2 ') Asset Tag: (' $3 ')  E-Mail: (' $1 ')' >> $BACKUPPCLOG
fi
-

That should give you a good starting point.

Regards,

Les



Re: [BackupPC-users] Arguments for --filter don't get proper escaping

2009-01-29 Thread Thomas Karcher
Hi Holger,

   I observed that with an rsyncd as backup partner,
   everything seems to work fine, but with a command line rsync, it's odd:
   
   Running: /usr/bin/sudo /usr/bin/rsync --server --sender --numeric-ids
   --perms --owner --group -D --links --hard-links --times
   --block-size=2048 --recursive --checksum-seed=32761 --one-file-system
   --filter=:-\\\ /nobackup.txt . /

 Actually, you need to use '$argList' instead of '$argList+' in
 $Conf{RsyncClientCmd} (and $Conf{RsyncClientRestoreCmd}). You are not
 passing the arguments through a shell (as you are when using 'ssh'), 
 so you don't want

For non-ssh rsync clients, that did the trick! Thank you!

 any escaping. That also means you should remove the backslash you apparently
 have in there now - contrary to your statement above. The entry in RsyncArgs
 should read '--filter=:- /nobackup.txt'. It is passed as one argument to
 'sudo', which will pass it on as one argument to 'rsync', I believe. Quite

I re-checked: there is no \ in my config.pl or host.pl in the
--filter statement! It is as I described. (The output above comes from a
manual command-line backup.)

But arguments are handled differently depending on whether I use rsync
locally, rsync via ssh, or rsyncd. So carefully choosing where to apply
the '+' does the trick after all.
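
For contrast, a hedged sketch of the ssh case, where the arguments do pass
through a remote shell and the escaped list is the one you want (a typical
ssh-based command is assumed here; rsyncd transfers carry RsyncArgs over the
rsync protocol with no shell involved at all):

    # ssh invokes a remote shell, so the escaped argument list is appropriate:
    $Conf{RsyncClientCmd} = '$sshPath -q -x -l root $host $rsyncPath $argList+';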


Thank you,
Thomas






Re: [BackupPC-users] Arguments for --filter don't get proper escaping

2009-01-29 Thread Juergen Harms
  In what way is your space character different from the one above (other
  than being escaped)?
For a single filter statement there is no difference whatever - the
advantage comes when you add additional filter statements - if Thomas
starts setting things up, why not in a way that scales?

Sorry for the other two issues - I did a bad job of rapidly copy/pasting
an example. But where is the problem in storing a filter file next to the
other configuration data?



Re: [BackupPC-users] Problem on Etch

2009-01-29 Thread Carl Wilhelm Soderstrom
On 01/29 01:25 , John wrote:
 *Worked* fine I must say because I dist-upgraded etch and got back the old 
 2.1.2-6 version - apparently I forgot to apt-pin the package to backports..
 
 So now my installation is messed up badly!
 
 Before I make any further changes to the server, does anyone have previous
 experience with this situation? Any other tips on how to restore my previous
 setup?

Can't you just install the new version again?
Is your config.pl clobbered, or is the problem just that the new file
doesn't work with the old version?

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



[BackupPC-users] how to examine the actual transferred size/MB of a running BackupPC_dump

2009-01-29 Thread Matthias Meyer
Is it possible to see how much data a running BackupPC_dump has copied
from the client to the server so far?

Thanks
Matthias
-- 
Don't Panic




[BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Brian Woodworth
I installed BackupPC a week ago and currently have 2 full backups and 6
incremental backups for 2 Windows machines, and one full backup for another
Windows machine.  I have not seen any pooling happen yet.  On the Status
page it says this:

Other info:

   - 0 pending backup requests from last scheduled wakeup,
   - 0 pending user backup requests,
   - 0 pending command requests,
   - Pool is 0.00GB comprising 0 files and 0 directories (as of 1/29 01:00),

   - Pool hashing gives 0 repeated files with longest chain 0,
   - Nightly cleanup removed 0 files of size 0.00GB (around 1/29 01:00),
   - Pool file system was recently at 83% (1/29 18:59), today's max is 83%
   (1/29 17:08) and yesterday's max was 82%.


Is there a configuration setting I need in order to enable pooling, or is it
because I don't have enough backups yet?  I left most of the config as
default.  Any info would be appreciated.


-- 
Brian Woodworth
bwoodwo...@gmail.com


Re: [BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Les Mikesell
Brian Woodworth wrote:
 I installed backuppc a week ago and currently have 2 full backups and 6
 incremental backups for 2 Windows machines and one full backup for another
 windows machine.  I have not seen and pooling happen yet.  on the Status
 page it says this:

All duplicate files should be pooled even if found in the first run.  Do 
you have the cpool and pc directories on the same filesystem?   Are 
there link errors in the logs for the backup runs?

-- 
   Les Mikesell
lesmikes...@gmail.com




Re: [BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Brian Woodworth
The cpool and pc directories are on the same file system.  The only errors in
the Xfer log are 'Device or resource busy' (16) for some files.


-- 
Brian Woodworth
bwoodwo...@gmail.com


Re: [BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Les Mikesell
Brian Woodworth wrote:
 cpool and pc directories are on the same file system.  The only errors in
 the Xfer log is Device or Resource Busy (16) for some files.

Are files showing up at all in cpool?  What is the link count on files 
you know are duplicated?  ls -l will show it, and it should be the 
number of duplicates plus one for the hashed name under cpool.  Does the 
file system type support hard links, and does it have free inodes (df -i)?

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Les Mikesell
Brian Woodworth wrote:
 the cpool directory is completely empty.  ls -l returns 'total 0'.  The file
 system type is ext3 and only 1% of inodes are in use.

If you have compression off you could be using pool instead of cpool, 
but the directory should at least be populated with several levels of 
subdirectories that are used to speed the lookup of the hashed 
filenames.  Perhaps something went wrong during the install.  Does the 
backuppc user have write access everywhere - and is SELinux involved?

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Craig Barratt
Brian writes:

  *   0 pending backup requests from last scheduled wakeup,
  *   0 pending user backup requests,
  *   0 pending command requests,
  *   Pool is 0.00GB comprising 0 files and 0 directories (as of 1/29 01:00),
  *   Pool hashing gives 0 repeated files with longest chain 0,
  *   Nightly cleanup removed 0 files of size 0.00GB (around 1/29 01:00),
  *   Pool file system was recently at 83% (1/29 18:59), today's max is 83% 
 (1/29 17:08) and yesterday's max was 82%.

It looks like BackupPC_nightly is failing to traverse the pool.
This could be due to a bug in IO::Dirent that causes it to fail
on certain file systems.  There is a test in 3.1.0 to check if
IO::Dirent works, but it checks '.', not $TopDir.  That bug is
fixed in CVS.

Do you have IO::Dirent installed and is your pool on XFS?

If so, change this line in lib/BackupPC/Lib.pm:

$IODirentOk = 1;

to:

$IODirentOk = 0;

Craig



Re: [BackupPC-users] When will backuppc begin pooling?

2009-01-29 Thread Brian Woodworth
Maybe I am blind, but I can't find the Lib.pm file anywhere.  I even did a
find on my whole system for Lib.pm and came up with nothing.


-- 
Brian Woodworth
bwoodwo...@gmail.com