Re: [BackupPC-users] ssh tunnel with DumpPreUserCmd

2006-02-26 Thread Craig Barratt
Tristan Krakau writes:

> due to the rsync-cygwin-ssh problem when backing up Windows clients
> using 'rsync' as transfer, I tried to use 'rsyncd' through a ssh tunnel
> instead (since the Windows client can only be accessed via ssh).
> 
> I found other threads dealing with this topic (e.g.
> http://sourceforge.net/mailarchive/message.php?msg_id=11482919), and all
> the hints there seemed to work fine:
> 
> - an ssh tunnel from the BackupPC server to the client is created
> - rsync is redirected to localhost:port and in this way connects to the
> rsyncd shares on the client
> - after backing up the host the tunnel is killed again
> 
> However, this only works when I manually set up the ssh tunnel, call
> BackupPC_dump, and tear down the tunnel afterwards.
> 
> But if I want the tunnel to be set-up by
> 
> $Conf{DumpPreUserCmd} = 'ssh -L 5009:localhost:873 -N -f
> [EMAIL PROTECTED]';
> 
> the tunnel is created - and the dump just waits... it simply does not
> continue until the ssh tunnel is killed (e.g. from another shell). Then
> the backup of course fails, because the tunnel is no longer there and
> rsyncd cannot be contacted.
> 
> I also tried putting the ssh -L ... command in another script:
> 
> host_prepare.sh:
> ssh -L 5009:localhost:873 -N -f [EMAIL PROTECTED]
> echo Tunnel to host was created
> --
> and it shows that the echo command, like any other command after the
> ssh command, is still executed, which tells me that ssh is started in the
> background, but the dump still pauses after the script has finished.
> 
> Also, using & instead of the -f option with ssh has no effect.
> 
> So my question is: How can I make the DumpPreUserCmd start the
> ssh-tunnel in the background and return so the dump can begin while the
> tunnel is there?
> 
> I think it could have something to do with the way Perl executes
> commands, maybe it always waits for all child-processes to end?
>
> I really hope someone could give me a hint how to solve this!

Yes, it must have to do with how perl executes these commands.
I'm not near a linux machine right now, so I can't test this.

First, I assume you tried "&" inside host_prepare.sh, since 
$Conf{DumpPreUserCmd} is exec'ed directly rather than via
a shell.

The perl code uses the open(F, "-|") form of fork to run your command.
That pipes stdout of the child to perl.  So the issue could be
that since stdout of the child (and its children) is still open,
perl continues to wait.

Inside the shell (assuming I've got the /bin/sh syntax right),
I recommend trying to redirect ssh's stdout and stderr to
/dev/null, eg:

ssh -L 5009:localhost:873 -N -f [EMAIL PROTECTED] 1>/dev/null 2>/dev/null

or

ssh -L 5009:localhost:873 -N [EMAIL PROTECTED] 1>/dev/null 2>/dev/null &
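
For what it's worth, here is a minimal sketch of how the whole setup could
look once the redirection works.  The script paths, the teardown script,
the port number, and the use of $Conf{ClientNameAlias} and
$Conf{RsyncdClientPort} are only illustrative (from memory, untested), so
adapt them to your setup:

#!/bin/sh
# host_prepare.sh -- bring up the tunnel, detached from stdout/stderr
# so the perl parent isn't left waiting on the pipe
ssh -L 5009:localhost:873 -N -f [EMAIL PROTECTED] 1>/dev/null 2>/dev/null

and in the per-PC config, point rsyncd at the local end of the tunnel and
tear it down afterwards:

$Conf{DumpPreUserCmd}   = '/path/to/host_prepare.sh';
$Conf{DumpPostUserCmd}  = '/path/to/host_teardown.sh';  # e.g. kill the ssh tunnel (illustrative)
$Conf{ClientNameAlias}  = 'localhost';
$Conf{RsyncdClientPort} = 5009;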

Craig




Re: [BackupPC-users] Backuppc Error: Unable to read 4 bytes

2006-02-26 Thread Craig Barratt
Rodrigo Real writes:

> I am having a problem with a host that must be backed-up with
> backuppc. This host runs an ssh server on the port 222, so I changed
> the variable $Conf{RsyncClientCmd} to meet this requirement, but when
> I try to run the backup on this host, I receive the error: "Unable to
> read 4 bytes". Below is the configuration I am using for this
> variable, in this host specific config.pl.
> 
> $Conf{RsyncClientCmd} = '$sshPath -p 222 -q -x -l root $host $rsyncPath $argList+';
> 
> The other hosts that I back up on this server are working just
> fine. Additionally, I can manually connect through ssh to this
> problematic host, so I think there is nothing wrong with ssh or
> the ssh key.
> 
> Has anyone experienced a problem like this?

What happens when you connect in exactly the same way as the
backuppc user:

su backuppc
ssh -p 222 -q -x -l root HOST whoami

Do you get prompted for a password?  Do you see any extraneous
output other than "root"?
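
If it isn't obvious by eye, one quick (purely optional) check for stray
banner or profile output -- often the cause of the "unable to read 4
bytes" failure, since anything extra corrupts the rsync protocol stream --
is to pipe the result through od:

ssh -p 222 -q -x -l root HOST whoami | od -c

Anything other than the bytes of "root" plus a newline points at the
remote shell's startup files or the sshd banner.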

Craig




Re: [BackupPC-users] Transforming a filled incr backup into a full backup

2006-02-26 Thread Craig Barratt
Nicolas MONNET writes:

> To save on bandwidth (I'm using BackupPC to back up servers from a
> datacenter to my office), I want to run incremental backups as much as
> possible.
> 
> I believe it shouldn't be too hard to write a tool to transform an
> incremental backup into a full backup.
> 
> * I guess the daemon has to be turned off first
> * Update the "backups" file
> 
> I've done this by hand, it seems to work, with the exception of the file 
> numbers/size being wrong.
> 
> It's not too much of a problem, I just want to know if there's gonna be 
> an issue with the pool when the last true full backup gets deleted?

Yes, you are describing one step in what is required to make
"perpetual incrementals" work in BackupPC.

Be aware that except for rsync, incrementals don't pick everything
up correctly since they just check mtimes, so deleted files,
unzipped files with old mtimes, renamed files, etc. aren't detected.
Rsync does the right thing, since all metadata is checked,
including whether each file is still present or has been deleted.
So "perpetual incrementals" are not recommended for anything
other than rsync.

But that said, rsync fulls don't send much data after the
first time, since only checksums are exchanged.

As you know, BackupPC needs a full (or filled) backup to "fill
in" an incremental for browse/restore.  There is a flag in the
backups file for "filled" (actually "noFill").  For fulls,
noFill = 0, and for incrementals noFill = 1.  A full backup can
be deleted if the "dependent" incrementals are first filled in
the manner you describe.  The field "fillFromNum" should be
set to the other backup that is used to fill in an incremental.
It could be a chain of incrementals until a filled backup
is hit.

Currently the logic for deciding which backup to use as
the reference for an incremental is to simply find the most
recent full.  That logic should be changed to find the
most recent filled backup.
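
Purely as an illustration of that change (the @Backups structure and the
noFill field are as I remember them, not checked against the current
code), the selection loop would go from "last full" to "last filled":

# hypothetical sketch: assuming @Backups holds the parsed backups file
# (e.g. from BackupInfoRead), pick the most recent filled backup
# (noFill == 0) as the reference, instead of the most recent full
my $refIdx = -1;
for ( my $i = 0 ; $i < @Backups ; $i++ ) {
    $refIdx = $i if ( $Backups[$i]{noFill} == 0 );
}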

For 3.0 I'm considering whether to implement multi-level
incrementals (eg: incrementals can depend upon other
incrementals, instead of the last full).  Most of the
pieces are in place, but it's not finished yet.

The second step, which probably won't make 3.0, is doing the
filling you describe so that "perpetual incrementals" can be
supported.

One other remark: in addition to hardlinking to fill an incremental,
you also need to merge the attrib files.  Without that, new files
won't appear in the browser and deleted files won't be tagged.
The function FillIncr in bin/BackupPC_link should do all of that,
but it has been a while since it was tested.

Craig




[BackupPC-users] Re: Backup failed

2006-02-26 Thread Craig Barratt
KOUAO aketchi writes:

> I have a problem when I back up a PC whose IP address has changed.
> When one of my Windows PCs has its IP address changed, BackupPC sends
> a message: "inet connect: Connection refused".  What is the reason
> for this failure?

If you mean the IP address change occurs during a backup, then
this is expected.  TCP connections won't survive a change in IP
address, so a backup will fail if the pc changes its IP address
during a backup.

> Sometimes this message appears: "unknown host", while the PC
> exists and is on the network.  Could you give me some reasons?  Thanks

You need to read the documentation to understand how BackupPC finds
PCs to back up and discovers or looks up their IP addresses.  You can
run BackupPC_dump with the -v option to see what commands it runs
and to see which one fails:

su backuppc
BackupPC_dump -v -f HOST
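
If the -v output shows the name lookup step failing, you can also run the
lookup by hand.  For a Windows PC the default discovery goes through
nmblookup (see $Conf{NmbLookupFindHostCmd}); the commands below are just a
manual sanity check, not what BackupPC runs verbatim:

nmblookup HOST        # NetBIOS name lookup, as the default config does
ping -c 1 HOST        # plain DNS/ICMP check, for comparison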

Craig




Re: [BackupPC-users] no xfer log

2006-02-26 Thread Craig Barratt
"Khaled Hussain" writes:

> For one of my XP hosts I don't seem to be generating Xfer log files, only a
> LOG file... I am getting a 'child exited prematurely' error in the LOG file
> one hour after the backup starts for this host, and that's all it says - I
> understand the Xfer log is useful for debugging info, but why does it not
> exist?

Are you looking in the per-PC directory (pc/HOST)?

Have you increased your $Conf{ClientTimeout}?
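
The timeout is an ordinary config setting, global or per-PC; the value
below is only an example -- pick anything comfortably longer than your
longest backup:

$Conf{ClientTimeout} = 72000;    # seconds; raise this if long backups get cut off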

Craig




Re: [BackupPC-users] MULTIPLE HOSTS FROM MACHINE

2006-02-26 Thread Craig Barratt
"Nick Barton" writes:

> Sorry if this question has already been answered; I think it is an easy
> fix but I am just not finding it anywhere. I need to be able to back up
> multiple host machines from one BackupPC computer on different schedules
> throughout the week, a total of 15 servers I think. Samba is my
> transfer method. How do I configure my config.pl file to do this for
> multiple hosts, and have a different schedule for each machine, say 3
> machines on Monday, 3 on Tuesday and so on? I tried creating separate
> config files and putting them in the folder for each machine under my
> /pc directory, but it seems to use just the config.pl from the /conf
> directory when starting a backup.

BackupPC doesn't provide an easy way to force full backups
on particular days of the week.  But that's because there
is already a good way to do it.

Using BackupPC_serverMesg you can manually start a full (by emulating
what the CGI interface does).  Look on the list for how to do this.
So you can use cron with BackupPC_serverMesg to run a full once per
week when you want.  You should increase $Conf{FullPeriod} a little
(eg: >10 days) so the automatic backups don't try to start at the
same time.  If you keep $Conf{IncrPeriod} the same (eg: daily) it
will continue to automatically run the incrementals each day.

To avoid a race between a cron-started full and the automatic incrementals,
simply choose an hour/minute in cron several minutes before the regular
BackupPC wakeup.  That way the cron full will already be running when
BackupPC checks what to do.
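
As a rough sketch only (the BackupPC_serverMesg argument order is from
memory -- double-check it against the list posts or the CGI source before
relying on it), a crontab entry for the backuppc user might look like:

# force a full of host1 every Monday at 21:45, a few minutes before an
# assumed 22:00 wakeup; install path, wakeup time, and argument order
# are illustrative, not verified
45 21 * * 1  /usr/local/BackupPC/bin/BackupPC_serverMesg backup host1 host1 backuppc 1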

Craig




Re: [BackupPC-users] tar.gz: md4 doesn't match

2006-02-26 Thread Craig Barratt
Sim writes:

> Sometimes I get an error report from "tar.gz" files.
> 
> ( BackupPC downloads them with rsync )
> 
> You can see this report, with "One Error":
> 
> Connected to srv1.lan:873, remote version 29
> Connected to module backup
> Sending args: --server --sender --numeric-ids --perms --owner --group
> --devices --links --times --block-size=2048 --recursive . .
> Xfer PIDs are now 12004
> [ 23 lines skipped ]
> var-www.tar.gz: md4 doesn't match: will retry in phase 1; file removed
> [ 4 lines skipped ]
> Done: 26 files, 587554421 bytes

Sometimes the rsync algorithm gets collisions between block digests
(ie: two different blocks actually have the same digest), so the wrong
block is used.  That causes the overall md4 checksum to fail.  A second
pass is then done with larger (stronger) block digests, and the file will
be transferred correctly in that second phase.  The only impact is an
increase in transfer time for the second phase.

In more recent rsync protocol versions, the first-pass digest
length is dynamic, reducing the chance of this happening, but
File::RsyncP 0.52 doesn't support this.  The next version will.

Craig

