Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-15 Thread Boniforti Flavio
 As long as the other ssh has gone away and is no longer listening on
 the local forwarding port you should be able to start another instance.
 If it is a problem you could wrap the ssh command in a script that
 sleeps a few seconds first to make sure the previous run has time to
 close down.

Just wanted to confirm that the DumpPreShareCmd approach has solved my
connection gap troubles.

Sadly, I *still* have to fight with the transfer of big files, which
doesn't succeed! :-/



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-14 Thread Boniforti Flavio

 Alternatively, set up the tunnel in DumpPreShareCmd instead 
 of DumpPreUserCmd.
 Then you will be using a new tunnel for each share.

Can I use the same TCP port for it?
That is my only concern right now...



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-14 Thread Boniforti Flavio

 I have (successfully) been using a script to do this for the 
 past few months. I copied this from somewhere else (the wiki 
 perhaps) and have tuned locally since then...
 
 I created a script in /etc/backuppc/scripts/hostname.sh 
 #!/bin/bash
 TERM=vt100
 /usr/bin/screen -d -m -S hosttunnel /usr/bin/ssh -o
 ServerAliveInterval=15 -o ServerAliveCountMax=10 -q -x -C -L
 1516:127.0.0.1:873 -l root host
 /bin/sleep 20
 
 # NOTE: sleep 20? we needed to introduce a small delay to ensure the
 # tunnel was fully established before rsync started
 # NOTE: -S hosttunnel helps us identify the process - so we can kill
 # it when the backups are finished
 # NOTE: This could be done with dtach instead of screen
 # NOTE: the string host should be replaced with the hostname
 
 $Conf{ClientNameAlias} = '127.0.0.1';
 $Conf{DumpPreUserCmd}='/etc/backuppc/scripts/host.sh';
 $Conf{DumpPostUserCmd}='/usr/bin/pkill -u backuppc -f host';
 $Conf{RestorePreUserCmd}='/etc/backuppc/scripts/host.sh';
 $Conf{RestorePostUserCmd}='/usr/bin/pkill -u backuppc -f host';
 $Conf{RsyncdClientPort}='1516';
 # This 1516 matches the 1516 in the above script; each tunneled host
 # should use a unique port number so that parallel backups don't
 # interfere with each other
 
 Hope that helps, if anyone sees something wrong with the above
 config, please let me know.

I'm interested in understanding how and why it is working in your
environment: can you explain?

Thanks...



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-14 Thread Les Mikesell
Boniforti Flavio wrote:
 Alternatively, set up the tunnel in DumpPreShareCmd instead 
 of DumpPreUserCmd.
 Then you will be using a new tunnel for each share.
 
 Can I use the same TCP port for it?
 That is my only concern right now...

As long as the other ssh has gone away and is no longer listening on
the local forwarding port you should be able to start another instance.
If it is a problem you could wrap the ssh command in a script that
sleeps a few seconds first to make sure the previous run has time to
close down.
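
Something along these lines, for example (untested sketch; the sleep value,
port and hostname here are only illustrative, and the ssh line should be
whatever you already use):

#!/bin/bash
# let the previous share's ssh finish closing and free the local port
/bin/sleep 5
# then start the forward again; -f backgrounds ssh once the tunnel is up,
# and the redirects keep it from holding BackupPC's output pipe open
/usr/bin/ssh -f -q -x -C -L 8874:127.0.0.1:873 -l root host \
    /bin/sleep 20 >/dev/null 2>&1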

But a generic VPN would make life simpler.

-- 
   Les Mikesell
lesmikes...@gmail.com



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-14 Thread Les Mikesell
Boniforti Flavio wrote:
 But a generic VPN would make life simpler.
 
 I would really *love* to set up a VPN: can you help? I already have a VPN
 configured on the server (Windows 2003 SBS SP2); I think I just need
 some command-line Linux client which would open the VPN connection
 (DumpPreUserCmd) and, once all the data has been transferred, simply
 close the gate.
 
 Any suggestions?

What's the problem with leaving it up all the time so you can manage the 
target(s) over it as well as doing backups?  You'd normally want your 
backuppc server to be fairly secure anyway - or you might want to route 
through a different VPN server handling routes for the whole LAN on
each side.  If you do need to start and stop it, the technique will 
depend on the vpn program.  I happen to like openvpn running in peer to 
peer mode where you end up with a process and tunnel interface per 
instance.  So you'd leave the target side up and listening all the time 
and the backuppc server side would start the program with the relevant 
config file to open the tunnel and either kill it or ifconfig down the 
tunnel interface to stop it.
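
As a rough illustration only (hostnames, key path and addresses are invented,
not anything from your setup):

# on the backup target, left running and listening (static-key p2p)
dev tun
ifconfig 10.8.99.1 10.8.99.2
secret /etc/openvpn/backuppc.key

# on the backuppc server
remote target.example.com
dev tun
ifconfig 10.8.99.2 10.8.99.1
secret /etc/openvpn/backuppc.key

# started/stopped from DumpPreUserCmd/DumpPostUserCmd, e.g.
#   openvpn --config /etc/openvpn/backuppc-tunnel.conf --daemon
#   pkill -f backuppc-tunnel.conf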

-- 
   Les Mikesell
lesmikes...@gmail.com



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-14 Thread Boniforti Flavio



On 14.04.09 16:55, Les Mikesell l...@futuresource.com wrote:

 I happen to like openvpn running in peer to
 peer mode where you end up with a process and tunnel interface per
 instance. 

OK, so I will take a look at openvpn. Leaving the VPN open is not a problem,
of course...

Thanks,
F.




Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-14 Thread Les Mikesell
Boniforti Flavio wrote:
 
 
 On 14.04.09 16:55, Les Mikesell l...@futuresource.com wrote:
 
 I happen to like openvpn running in peer to
 peer mode where you end up with a process and tunnel interface per
 instance. 
 
 OK, so I will take a look at openvpn. Leaving the VPN open is not a problem,
 of course...

I'm on the openvpn mail list too, so ask there if you have any problems. 
  For a simple host to host connection you don't need to deal with any 
routing issues - just give each end adjacent private IP addresses that 
are in networks you don't use anywhere else.  If you want to route for 
the rest of the LAN it gets a little more complicated.
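
For example (addresses picked arbitrarily from a range you don't use
anywhere else):

# openvpn ifconfig on the backuppc side: local 10.8.99.2, peer 10.8.99.1
ifconfig 10.8.99.2 10.8.99.1
# then point BackupPC at the far end of the tunnel
$Conf{ClientNameAlias} = '10.8.99.1';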

-- 
   Les Mikesell
lesmikes...@gmail.com





Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-13 Thread Holger Parplies
Hi,

Adam Goryachev wrote on 2009-04-13 13:15:40 +1000 [Re: [BackupPC-users] Not 
backing up more that first module/directory (rsyncd)]:
 Holger Parplies wrote:
  Boniforti Flavio wrote on 2009-04-12 10:20:12 +0200 [[BackupPC-users] Not 
  backing up more that first module/directory (rsyncd)]:
 
  What I'm now experiencing is that after having correctly completed the first
  module/directory, it doesn't connect to the subsequent one anymore.
 [...]
  I agree that it is the ssh tunnel closing when the last (= only) tunnelled
  connection closes.
 [...]
 
 I have (successfully) been using a script to do this for the past few
 months. I copied this from somewhere else (the wiki perhaps) and have
 tuned locally since then...
 
 I created a script in /etc/backuppc/scripts/hostname.sh
 #!/bin/bash
 TERM=vt100
 /usr/bin/screen -d -m -S hosttunnel /usr/bin/ssh -o
 ServerAliveInterval=15 -o ServerAliveCountMax=10 -q -x -C -L
 1516:127.0.0.1:873 -l root host
 /bin/sleep 20

I fail to see how running the 'ssh host sleep 20' inside screen would help.
It only seems to prevent an error from ssh from being noticed by BackupPC.
Are you really referring to a *multiple share* backup?

 [...]
 $Conf{DumpPreUserCmd}='/etc/backuppc/scripts/host.sh';

Wasn't that 'hostname.sh'? ;-)

 $Conf{DumpPostUserCmd}='/usr/bin/pkill -u backuppc -f host';

Hmm, how about 'screen -S hosttunnel -X quit'?
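
I.e. something like

  $Conf{DumpPostUserCmd} = '/usr/bin/screen -S hosttunnel -X quit';

which tells screen to shut the session (and the ssh running inside it) down
cleanly, instead of pattern-matching process names with pkill.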

Regards,
Holger



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-13 Thread Adam Goryachev

Holger Parplies wrote:
 Hi,
 
 Adam Goryachev wrote on 2009-04-13 13:15:40 +1000 [Re: [BackupPC-users] Not 
 backing up more that first module/directory (rsyncd)]:
 Holger Parplies wrote:
 Boniforti Flavio wrote on 2009-04-12 10:20:12 +0200 [[BackupPC-users] Not 
 backing up more that first module/directory (rsyncd)]:
 What I'm now experiencing is that after having correctly completed the first
 module/directory, it doesn't connect to the subsequent one anymore.
 [...]
 I agree that it is the ssh tunnel closing when the last (= only) tunnelled
 connection closes.
 [...]

 I have (successfully) been using a script to do this for the past few
 months. I copied this from somewhere else (the wiki perhaps) and have
 tuned locally since then...

 I created a script in /etc/backuppc/scripts/hostname.sh
 #!/bin/bash
 TERM=vt100
 /usr/bin/screen -d -m -S hosttunnel /usr/bin/ssh -o
 ServerAliveInterval=15 -o ServerAliveCountMax=10 -q -x -C -L
 1516:127.0.0.1:873 -l root host
 /bin/sleep 20
 
 I fail to see how running the 'ssh host sleep 20' inside screen would help.
 It only seems to prevent an error from ssh from being noticed by BackupPC.
 Are you really referring to a *multiple share* backup?

Neither do I, but yes, it is a multi-share backup. edrive is done
first, which is 100G or so of images and then cdrive is done...

 [...]
 $Conf{DumpPreUserCmd}='/etc/backuppc/scripts/host.sh';
 
 Wasn't that 'hostname.sh'? ;-)

Well, neither was the literal text host.sh or hostname.sh. I use a
different script for each host, but didn't want to show my actual
hostname being referred to, so I made it generic.

 $Conf{DumpPostUserCmd}='/usr/bin/pkill -u backuppc -f host';
 
 Hmm, how about 'screen -S hosttunnel -X quit'?

I'm not sure; I know very little about screen, and I really just copied
the scripts from somewhere else. Perhaps screen can't quit if the ssh
command is still running?
Perhaps the fact that this still works for me might also be why I seem
to get hanging bash/ssh processes on the remote side.

I might give your suggestion a try, but I'm not sure it will help...


Regards,
Adam



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-12 Thread Holger Parplies
Hi,

Boniforti Flavio wrote on 2009-04-12 10:20:12 +0200 [[BackupPC-users] Not 
backing up more that first module/directory (rsyncd)]:
 I've successfully set up BackupPC to get data from remote Windows hosts with
 an ssh tunnel and rsyncd.
 
 What I'm now experiencing is that after having correctly completed the first
 module/directory, it doesn't connect to the subsequent one anymore.
 Here is a log excerpt:
 
 [...]
 incr backup started back to 2009-04-10 17:33:03 (backup #2) for directory
 ExchBkp
 Error connecting to rsync daemon at localhost:8874: inet connect:
 Connessione rifiutata
 Got fatal error during xfer (inet connect: Connessione rifiutata)
 Backup aborted (inet connect: Connessione rifiutata)
 
 I thought it was the ssh tunnel being closed, so I changed the "sleep 20"
 command to "sleep 60", but that didn't change anything.

I agree that it is the ssh tunnel closing when the last (= only) tunnelled
connection closes. Changing the sleep parameter to 60 should only make a
difference if the first module takes less than 60 seconds to back up.
Remember that it's not the time between the two shares you're gapping, it's
the time between DumpPreUserCmd and the first connection. After running the
command (sleep 20), ssh waits for any tunnelled connections to close and then
terminates.

 What do you people suggest?

I haven't got a working implementation, but I'd suggest running a command that
does *not* complete (ever, basically) over ssh in the DumpPreUserCmd and
killing that command (or the ssh) in the DumpPostUserCmd. That would also get
rid of the race condition you currently have: if the backup, for some reason,
takes more than 20 seconds to start (from the time the DumpPreUserCmd is run),
it will fail. That is probably highly unlikely, but it *is* possible.

Alternatively, set up the tunnel in DumpPreShareCmd instead of DumpPreUserCmd.
Then you will be using a new tunnel for each share.
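
Untested, but the first variant could look roughly like this (the script name
is made up; 8874 has to match your $Conf{RsyncdClientPort}, and 'host' your
real hostname):

/etc/backuppc/scripts/host-tunnel.sh:
#!/bin/bash
# keep ssh running until it is explicitly killed; -N means "no remote
# command", so nothing on the far side has to sleep
/usr/bin/ssh -N -x -C -L 8874:127.0.0.1:873 -l root host \
    </dev/null >/dev/null 2>&1 &
# give the port forwarding a moment to come up before the first connect
/bin/sleep 5

$Conf{DumpPreUserCmd}  = '/etc/backuppc/scripts/host-tunnel.sh';
$Conf{DumpPostUserCmd} = '/usr/bin/pkill -u backuppc -f host';

For the second variant, the same script (and the pkill) would go into
$Conf{DumpPreShareCmd} and $Conf{DumpPostShareCmd} instead, so that every
share gets a fresh tunnel.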

Regards,
Holger



Re: [BackupPC-users] Not backing up more that first module/directory (rsyncd)

2009-04-12 Thread Adam Goryachev

Holger Parplies wrote:
 Hi,
 
 Boniforti Flavio wrote on 2009-04-12 10:20:12 +0200 [[BackupPC-users] Not 
 backing up more that first module/directory (rsyncd)]:
 I've successfully set up BackupPC to get data from remote Windows hosts with
 an ssh tunnel and rsyncd.

 What I'm now experiencing is that after having correctly completed the first
 module/directory, it doesn't connect to the subsequent one anymore.
 Here is a log excerpt:

 [...]
 incr backup started back to 2009-04-10 17:33:03 (backup #2) for directory
 ExchBkp
 Error connecting to rsync daemon at localhost:8874: inet connect:
 Connessione rifiutata
 Got fatal error during xfer (inet connect: Connessione rifiutata)
 Backup aborted (inet connect: Connessione rifiutata)

 I thought it was the ssh tunnel being closed, so I changed the "sleep 20"
 command to "sleep 60", but that didn't change anything.
 
 I agree that it is the ssh tunnel closing when the last (= only) tunnelled
 connection closes. Changing the sleep parameter to 60 should only make a
 difference if the first module takes less than 60 seconds to back up.
 Remember that it's not the time between the two shares you're gapping, it's
 the time between DumpPreUserCmd and the first connection. After running the
 command (sleep 20), ssh waits for any tunnelled connections to close and then
 terminates.
 
 What do you people suggest?
 
 I haven't got a working implementation, but I'd suggest running a command that
 does *not* complete (ever, basically) over ssh in the DumpPreUserCmd and
 killing that command (or the ssh) in the DumpPostUserCmd. That would also get
 rid of the race condition you currently have: if the backup, for some reason,
 takes more than 20 seconds to start (from the time the DumpPreUserCmd is run),
 it will fail. That is probably highly unlikely, but it *is* possible.
 
 Alternatively, set up the tunnel in DumpPreShareCmd instead of DumpPreUserCmd.
 Then you will be using a new tunnel for each share.

I have (successfully) been using a script to do this for the past few
months. I copied this from somewhere else (the wiki perhaps) and have
tuned locally since then...

I created a script in /etc/backuppc/scripts/hostname.sh
#!/bin/bash
TERM=vt100
/usr/bin/screen -d -m -S hosttunnel /usr/bin/ssh -o
ServerAliveInterval=15 -o ServerAliveCountMax=10 -q -x -C -L
1516:127.0.0.1:873 -l root host
/bin/sleep 20

# NOTE: sleep 20? we needed to introduce a small delay to ensure the
# tunnel was fully established before rsync started
# NOTE: -S hosttunnel helps us identify the process - so we can kill
# it when the backups are finished
# NOTE: This could be done with dtach instead of screen
# NOTE: the string host should be replaced with the hostname

$Conf{ClientNameAlias} = '127.0.0.1';
$Conf{DumpPreUserCmd}='/etc/backuppc/scripts/host.sh';
$Conf{DumpPostUserCmd}='/usr/bin/pkill -u backuppc -f host';
$Conf{RestorePreUserCmd}='/etc/backuppc/scripts/host.sh';
$Conf{RestorePostUserCmd}='/usr/bin/pkill -u backuppc -f host';
$Conf{RsyncdClientPort}='1516';
# This 1516 matches the 1516 in the above script; each tunneled host
# should use a unique port number so that parallel backups don't
# interfere with each other


Hope that helps, if anyone sees something wrong with the above config,
please let me know.

Regards,
Adam
