Re: [BackupPC-users] Transforming a filled incr backup into a full backup

2006-02-26 Thread Craig Barratt
Nicolas MONNET writes:

 To save on bandwidth (I'm using backuppc to backup servers from a 
 datacenter to my office), I want to run incremental backups as much as 
 possible.
 
 I believe it shouldn't be too hard to write a tool to transform an 
 incremental backup into a full backup.
 
 * I guess the daemon has to be turned off first
 * Update the backups file
 
 I've done this by hand, it seems to work, with the exception of the file 
 numbers/size being wrong.
 
 It's not too much of a problem, I just want to know if there's gonna be 
 an issue with the pool when the last true full backup gets deleted?

Yes, you are describing one step in what is required to make
perpetual incrementals work in BackupPC.

Be aware that except for rsync, incrementals don't pick everything
up correctly, since they only check mtimes; deleted files,
unzipped files with old mtimes, renamed files, etc. aren't detected.
Rsync does the right thing, since all metadata is checked,
including presence/deletion.  So perpetual incrementals are
not recommended for anything other than rsync.

But that said, rsync fulls don't send much data after the
first time, since only checksums are exchanged.

As you know, BackupPC needs a full (or filled) backup to fill
in an incremental for browse/restore.  There is a flag in the
backups file for filled (actually noFill).  For fulls,
noFill = 0, and for incrementals noFill = 1.  A full backup can
be deleted if the dependent incrementals are first filled in
the manner you describe.  The field fillFromNum should be
set to the number of the backup used to fill in an incremental.
This can form a chain of incrementals until a filled backup
is reached.
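As a sketch of the bookkeeping described above, the change amounts to flipping noFill and recording fillFromNum in the tab-separated backups file. Note the column indices below are assumptions for illustration, not taken from the BackupPC source; check BackupPC::Lib for the real layout before editing anything by hand.

```python
# Sketch: mark one record in the tab-separated backups file as filled.
# ASSUMPTION: 0-based column 17 = noFill, column 18 = fillFromNum
# (verify against your BackupPC version's BackupPC::Lib).

NOFILL_COL = 17
FILLFROM_COL = 18

def mark_filled(line, fill_from_num):
    """Return the backups-file line with noFill=0 and fillFromNum set."""
    fields = line.rstrip("\n").split("\t")
    fields[NOFILL_COL] = "0"                  # 0 means "filled"
    fields[FILLFROM_COL] = str(fill_from_num)  # which backup filled it
    return "\t".join(fields) + "\n"

# Example: a fabricated 22-field record for incremental #5 (noFill=1),
# filled from backup #4.
record = "\t".join(["5"] + ["0"] * 16 + ["1", "", "1", "rsync", "1"]) + "\n"
filled = mark_filled(record, 4)
```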

Currently the logic for deciding which backup to use as
the reference for an incremental is to simply find the most
recent full.  That logic should be changed to find the
most recent filled backup.

For 3.0 I'm considering whether to implement multi-level
incrementals (eg: incrementals can depend upon other
incrementals, instead of the last full).  Most of the
pieces are in place, but it's not finished yet.

The second step, which probably won't make 3.0, is doing the
filling you describe so that perpetual incrementals can be
supported.

One other remark: in addition to hardlinking to fill an incremental
you also need to merge the attrib files.  Without that new files
won't appear in the browser and deleted files won't be tagged.
The function FillIncr in bin/BackupPC_link should do all of that,
but it has been a while since it was tested.

Craig


---
This SF.Net email is sponsored by xPML, a groundbreaking scripting language
that extends applications into web and mobile media. Attend the live webcast
and join the prime developer group breaking into this new coding territory!
http://sel.as-us.falkag.net/sel?cmd=lnkkid0944bid$1720dat1642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


[BackupPC-users] Re: Backup failed

2006-02-26 Thread Craig Barratt
KOUAO aketchi writes:

 I have a problem when I back up a PC whose IP address has changed.
 When one of my Windows PCs has its IP address changed, BackupPC sends
 a message: inet connect: connection refused.  What is the reason
 for this failure?

If you mean the IP address change occurs during a backup, then
this is expected.  TCP connections won't survive a change in IP
address, so a backup will fail if the pc changes its IP address
during a backup.

 Sometimes this message appears: unknown host, while the pc
 exists and is on the network.  Could you give me some reasons?  Thanks

You need to read the documentation to understand how BackupPC finds
pcs to backup and discovers or looks up their IP address.  You can
run BackupPC_dump with the -v option to see what commands it runs
and to see which one fails:

su backuppc
BackupPC_dump -v -f HOST

Craig




Re: [BackupPC-users] no xfer log

2006-02-26 Thread Craig Barratt
Khaled Hussain writes:

 For one of my XP hosts I don't seem to be generating Xfer log files, only a
 LOG file.  I am getting a 'child exited prematurely' error in the LOG file
 one hour after the backup starts for this host, and that's all it says.  I
 understand the Xfer log is useful for debugging info, but why does it not
 exist?

Are you looking in the per-PC directory (pc/HOST)?

Have you increased your $Conf{ClientTimeout}?

Craig




Re: [BackupPC-users] MULTIPLE HOSTS FROM MACHINE

2006-02-26 Thread Craig Barratt
Nick Barton writes:

 Sorry if this question has already been answered; I think it is an easy
 fix but I am just not finding it anywhere. I need to be able to back up
 multiple host machines from one BackupPC computer on different schedules
 throughout the week, a total of 15 servers. Samba is my transfer method.
 How do I configure my config.pl file to do this for multiple hosts, and
 have a different schedule for each machine, say 3 machines on Monday,
 3 on Tuesday and so on? I tried creating separate config files and
 putting them in the folder for each machine under my /pc directory,
 but it seems to use just the config.pl from the /conf directory when
 starting a backup.

BackupPC doesn't provide an easy way to force full backups
on particular days of the week.  But that's because there
is already a good way to do it.

Using BackupPC_serverMesg you can manually start a full (by emulating
what the CGI interface does).  Look on the list for how to do this.
So you can use cron with BackupPC_serverMesg to run a full once per
week when you want.  You should increase $Conf{FullPeriod} a little
(eg: 10 days) so the automatic backups don't try to start at the
same time.  If you keep $Conf{IncrPeriod} the same (eg: daily) it
will continue to automatically run the incrementals each day.

To avoid a race between a cron full and the automatic incrementals,
simply chose an hour/minute in cron several minutes before the regular
BackupPC wakeup.  That way the cron full will already be running when
BackupPC checks what to do.
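A hypothetical crontab along these lines (the install path and the BackupPC_serverMesg argument order here are assumptions; confirm the exact syntax against the list archive, as suggested above):

```shell
# Hypothetical crontab entries for the backuppc user.  The path
# /usr/local/BackupPC and the argument order "backup host hostIP user doFull"
# are assumptions; verify them against your install before relying on this.
# Full for host1 on Mondays at 00:50, a few minutes before a 01:00 wakeup:
50 0 * * 1  /usr/local/BackupPC/bin/BackupPC_serverMesg backup host1 host1 backuppc 1
# Full for host2 on Tuesdays:
50 0 * * 2  /usr/local/BackupPC/bin/BackupPC_serverMesg backup host2 host2 backuppc 1
```

With $Conf{FullPeriod} raised to roughly 10 days, these cron fulls always run first, and the daily incrementals continue automatically.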

Craig




Re: [BackupPC-users] tar.gz: md4 doesn't match

2006-02-26 Thread Craig Barratt
Sim writes:

 Sometimes I get an error report from tar.gz files.
 
 ( BackupPC downloads them with rsync )
 
 You can see this report with one error:
 
 Connected to srv1.lan:873, remote version 29
 Connected to module backup
 Sending args: --server --sender --numeric-ids --perms --owner --group
 --devices --links --times --block-size=2048 --recursive . .
 Xfer PIDs are now 12004
 [ skipped 23 lines ]
 var-www.tar.gz: md4 doesn't match: will retry in phase 1; file removed
 [ skipped 4 lines ]
 Done: 26 files, 587554421 bytes

Sometimes the rsync algorithm gets collisions between block digests
(ie: two different blocks actually have the same digest), so the wrong
block is used.  This causes the overall md4 checksum to fail.  A second
pass is then made with larger (stronger) block digests, and the file
will be transferred correctly in that second phase.  The only impact
is an increase in transfer time for the second phase.

In more recent rsync protocol versions, the first-pass digest
length is dynamic, reducing the chance of this happening, but
File::RsyncP 0.52 doesn't support this.  The next version will.
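A rough birthday-bound estimate shows why this is rare per file but still shows up occasionally on big files. The 48-bit effective per-block digest assumed below (32-bit rolling checksum plus a 16-bit truncated strong checksum) is an illustration; the real widths depend on the protocol version.

```python
import math

# Birthday-bound approximation of P(at least one block-digest collision)
# among n_blocks digests of digest_bits bits each.
def collision_prob(n_blocks, digest_bits):
    pairs = n_blocks * (n_blocks - 1) / 2
    return 1.0 - math.exp(-pairs / 2.0 ** digest_bits)

# A ~2 GB file in 2048-byte blocks has about a million blocks.
# ASSUMPTION: ~48-bit effective first-pass digest.
p = collision_prob(1_000_000, 48)
# p is a fraction of a percent per file, but across many large files an
# occasional "md4 doesn't match: will retry in phase 1" is expected.
```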

Craig




Re: [BackupPC-users] Pool Size larger than Harddrive usage

2006-02-28 Thread Craig Barratt
Bryan Penney writes:

 On the status page the pool information is reported as
 
 Pool is 227.53GB comprising 1157013 files and 4369 directories (as of 
 2/28 08:33),
 
 When I run df -h /dev/sda3 (the raid backuppc is on) I get:
 
 FilesystemSize  Used Avail Use% Mounted on
 /dev/sda3 1.9T  222G  1.6T  12% /var/lib/backuppc
 
 This means that the status page reports the pool file being 5G larger 
 than the space being used on the drive.  This backup machine's pool was 
 originally rsync'd off of our old backupPC machine.
 
 This could mean that I am missing some data, some hard links are double 
 counted, or that backupPC is doing some sort of rounding.
 
 Any ideas which one(s) are more likely?

BackupPC uses 1000MB (not 1024MB) for G, while df uses 2^30
(1024MB).  So the BackupPC number will be 2.4% higher.
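The 2.4% figure follows directly from the two unit definitions:

```python
# BackupPC reports "GB" as 1000 * 2^20 bytes; df -h uses 2^30 bytes.
backuppc_gb = 1000 * 2**20        # 1,048,576,000 bytes
df_gb = 2**30                     # 1,073,741,824 bytes
ratio = df_gb / backuppc_gb       # 1.024 exactly
pct_higher = (ratio - 1) * 100    # BackupPC's number reads ~2.4% higher
```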

I should probably change BackupPC to use 2^30 too; I don't know
why I picked 1000 * 2^20 - maybe that's what disk drives use
for raw capacity?

The pool size is the sum of the number of blocks used for each
file in the pool.  In general that should be less than reported
by df since the pc directories contain full directory trees
(just containing hardlinks of course) and Xfer log files that
occupy some disk space.

So your results above make sense if you have copied the pool
but there aren't too many backups saved.

Craig




Re: [BackupPC-users] Best software to backup BackupPC Server to tape

2006-03-03 Thread Craig Barratt
Dan Pritts writes:

 An idea i had for offsites is to just run rsync against the raw device.
 rsync would need to be patched to allow this, and apparently rsync has
 some issues with very large files.

This should work well.  However, rsync currently doesn't copy
device file contents, it just mknods the device.  A new option
would be needed to make it open/copy device file contents.
Since rsync has an inplace option (avoiding the need for a temp
file) updating a remote raw device should be possible and
practical.

Rsync's issues with large files has been solved by dynamic
sizing of the first-pass digests.  Also, another optimization
would be to only match blocks on whole sector boundaries,
not on byte boundaries.

Craig




Re: [BackupPC-users] XferLog and Incremental Backups

2006-03-06 Thread Craig Barratt
[EMAIL PROTECTED] writes:

 Hi -- I'm somewhat confused with how I thought incremental backups work and
 with what is shown in XferLog. Perhaps some kind soul can help me out? I'm
 running BackupPC 2.1.2 on FC4 against Linux and WinXP hosts.
 
 (1) When I look in XferLog, every directory is listed as created on every
 run and, in fact, is created on every run. The documentation says that this
 only happens for new directories, yet we are recreating the entire directory
 structure every run. Did I misunderstand, mess up, or what?

The entire directory structure is created on incrementals.
The reason is that it is difficult to know whether some
subdirectory might contain a new file, so the whole tree
is created.  The automatic merging of incrementals with
earlier fulls relies on the fact that the incremental
contains the full tree.
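The merge relied on above can be sketched as a simple overlay of the incremental's entries onto the full's tree (deletion handling via the merged attrib files is omitted here):

```python
# Sketch: the "filled" view for browse/restore is the full backup's tree
# with the incremental's entries layered on top; incremental entries win.
def fill_view(full_tree, incr_tree):
    """Merge two {path: version} mappings; incremental entries override."""
    merged = dict(full_tree)
    merged.update(incr_tree)
    return merged

full = {"/etc/hosts": "v1", "/etc/motd": "v1"}
incr = {"/etc/hosts": "v2"}           # only the changed file was transferred
view = fill_view(full, incr)          # browser sees v2 plus untouched v1
```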

 (2) Files are being backed up in the incremental runs even though they have
 not changed. How can I work out why that is, or how do I stop this?

Do you mean not changed since the last incremental?  This depends
on your XferMethod.  If you are using smb or tar, then a file
with an mtime after the last full backup will be backed up on
every incremental, since each incremental is level 1 (ie: relative
to the previous full).
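For smb and tar the level-1 selection reduces to a single mtime comparison against the previous full, which is why an unchanged-since-yesterday file keeps getting re-sent:

```python
# Sketch of the level-1 rule for smb/tar: a file is sent on an incremental
# whenever its mtime is newer than the previous FULL, so a file changed once
# after the full is re-sent by every subsequent incremental until the next full.
def sent_on_incremental(mtime, last_full_time):
    return mtime > last_full_time

last_full = 1000
changed_file_mtime = 1500     # modified once, after the full
unchanged_file_mtime = 900    # older than the full
```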

 (3)  In the incremental runs, sometimes the entire contents of a directory
 is backed up if only 1 file has changed, sometimes all of the files in the
 directory including those that have not changed. This is probably related to
 (2). Again, .. help!

Can you show a precise example of the entries in the XferLOG file,
the actual file mtimes, and the time of the previous full?

 (4) I've read the documentation but I'm not sure I understand the subtlety
 between pool and same when referring to files that have not changed
 since the last full backup. Can someone please help me understand the
 difference, perhaps with some examples?

Pool means the file was received and matched to an existing file in
the pool.  Same means that rsync determined the file is identical to
the file with the same path in the earlier full, and the file itself
was not transferred.

Craig




Re: [BackupPC-users] backing up files that change during backup

2006-03-08 Thread Craig Barratt
Stephen Vaughan writes:

 Is there any way to get BackupPC to continue a backup regardless of
 errors? I had it backing up a 2gb db file, and during the transfer the
 file was modified; backuppc recognised this and aborted the backup.
 
 Remote[1]: send_files failed to open
 misc/backups/netchant-db.blobdump: No such file or directory
 misc/backups/netchant-db.dump: md4 doesn't match: will retry in phase
 1; file removed
 [ skipped 11751 lines ]
 Can't write 32840 bytes to socket
 [ skipped 3882 lines ]
 Read EOF:
 Tried again: got 0 bytes
 Child is aborting
 Parent read EOF from child: fatal error!
 Done: 14625 files, 1369509991 bytes
 Got fatal error during xfer (Child exited prematurely)
 Backup aborted (Child exited prematurely)

It is continuing: there are 11751 files mentioned before
the next error.

What is the value of $Conf{ClientTimeout}?  If it is 7200
please increase it by 20x.

Craig




Re: [BULK] RE: [BackupPC-users] Unexpected end of tar archive

2006-03-13 Thread Craig Barratt
Justin Best writes:

 Thanks for your help, Craig! I appreciate it so very much.
 
 I've filed a bug report with Samba, as you suggested.
 https://bugzilla.samba.org/show_bug.cgi?id=3592 
 
 Out of curiosity, though, it seems that you still have an open bug that
 references the same problem. In bug 563
 https://bugzilla.samba.org/show_bug.cgi?id=563
 you mentioned smbclient tar extract still only writes a maximum 2GB file;
 not sure why. Am I right in assuming that this is the same issue?  

That's a bug in the other direction (restoring large files).
Your bug involves padding large locked files when creating an
archive.

 In any event, I'm still unfamiliar with how I should go about fixing this in
 my case. I assume that I need to download the source and modify it and
 recompile smbclient... but I haven't a clue on how to make padsize an
 unsigned int, as you suggest. I'm afraid that I know some ASP and PHP, and
 that's about it. Any direction about where to start with this would be
 appreciated.

You should download the latest source for samba, and edit
samba-3.X.X/source/client/clitar.c so that this line:

static int padit(char *buf, int bufsize, int padsize)

instead reads:

static int padit(char *buf, int bufsize, SMB_BIG_UINT padsize)

Then you should be able to run configure and make, then use
the resulting smbclient executable.

I haven't tested this fix, but it should be correct. 

Craig




Re: [BackupPC-users] Tar error

2006-03-20 Thread Craig Barratt
Vinícius Medina writes:
 I am having this problem with BackupPC:

 Running: /usr/bin/ssh -q -n -l USER* HOST* /bin/gtar -c -v -f - -C
 /arquivos/ --totals .

What happens when you run this:

su backuppc
/usr/bin/ssh -q -n -l USER* HOST* /bin/gtar -c -f - -C /arquivos/ --totals 
. | tar -tvf -

or even

/usr/bin/ssh -q -n -l USER* HOST* /bin/gtar -c -f - -C /arquivos/ --totals 
. | od -c | more

Craig




Re: [BackupPC-users] BackupPC_Link performance and locking out other jobs?

2006-03-26 Thread Craig Barratt
Steve Willoughby writes:

 I'm just getting started with BackupPC and up until the last machine I added,
 things were going fairly well.  I'm backing up to a 250Gb external USB drive,
 which is being incredibly slow.  It's taken it something like 4 days to 
 perform
 the BackupPC_dump operation, and about another 4 days now to run 
 BackupPC_link.
 
 I've already figured out why it's being so slow in the first place, and will
 be correcting it as soon as this finishes running, but the behavior of 
 BackupPC in relation to this running job is a little unclear from what I've
 read in the docs.
 
 It looks like the BackupPC_link is preventing any other job from starting.
 I have other backups in the queue, but the status page says idle; nothing
 to do for them.  Is there something about BackupPC_link that can't allow
 any other backups or restores to start while it runs?
 
 Is there a way to look at the current server status and actually see 
 what's queued up and what's blocking each job from starting?

It's actually a pending BackupPC_nightly that is preventing
new backups from starting.  Each night BackupPC_nightly is
queued, and while it is queued no new backups will be started.
Since the existing BackupPC_link takes a long time to finish,
BackupPC_nightly is still pending, and no new backups will
start (unless you start one manually).

At most a single BackupPC_link or BackupPC_nightly can run.
BackupPC_dump can run in parallel with BackupPC_link, but
not BackupPC_nightly.

I am considering methods for the next release to reliably
decouple BackupPC_dump from BackupPC_nightly.

Craig




Re: [BackupPC-users] why are full backups needed with BackupPC?

2006-03-26 Thread Craig Barratt
Laurent Mazet writes:

 To summarize, for a Windows host:

 - rsync over ssh doesn't work.

Yes, but I haven't tested it recently.

 - rsyncd transfers only diff but you need to connect with a clear password.

Rsyncd doesn't send a clear password over the network.  It uses
a digest-based challenge/response.
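The challenge/response idea can be sketched as follows. Note rsyncd actually uses MD4; md5 stands in below only because Python's hashlib often lacks md4 support. The point is that only the digest crosses the wire, never the password itself.

```python
import base64
import hashlib
import os

# Conceptual sketch of digest-based challenge/response auth like rsyncd's.
# ASSUMPTION: md5 is a stand-in for rsyncd's MD4.
def make_challenge():
    """Server sends a fresh random challenge to the client."""
    return base64.b64encode(os.urandom(16)).decode()

def auth_response(password, challenge):
    """Both sides compute a digest over password + challenge."""
    digest = hashlib.md5((password + challenge).encode()).digest()
    return base64.b64encode(digest).decode()

challenge = make_challenge()
client_resp = auth_response("secret", challenge)   # sent over the wire
server_resp = auth_response("secret", challenge)   # computed locally, compared
```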

 - tar over ssh transfers everything.

Yes, for a full.

 - smb transfers everything and you need to connect with a clear password.

I'm not sure whether smb sends clear passwords over the network.

Craig




Re: [BackupPC-users] error checking on DumpPreUserCmd

2006-03-27 Thread Craig Barratt
Matthias Bertschy writes:

 We are using Backuppc as our main backup solution for a dozen servers.
 One of our servers has a Perforce server, so we need to checkpoint its 
 database before backing up (it could also be a mysqldump).
 
 Some days ago, we detected a problem on the server preventing the 
 successful checkpointing of the Perforce database.
 Needless to say the backup was not usable, however, I would have loved 
 to see an error email and the backup marked as unsuccessful in Backuppc 
 host summary.
 
 Is it possible to send error messages from the DumpPreUserCmd command 
 to BackupPC to stop the backup and inform users?

This is on the todo list, but not implemented yet.

Craig




Re: [BackupPC-users] restarting BackupPC_link

2006-03-27 Thread Craig Barratt
Steve Willoughby writes:

 Can I assume that if BackupPC_link gets interrupted (say, by a system
 reboot), that re-running it will continue linking the backup files into
 the pool from where it left off, and not start over or get confused?
 
 Is there any special procedure for doing this correctly (or do I just let
 the automatic nightly and backup jobs run and it'll figure out what it 
 still needs to do)?

Interrupting BackupPC_link should be fine.  It will run next
time BackupPC_dump checks the host, even if there is nothing
to do.

Craig




Re: [BackupPC-users] backupPC nightly backups spilling into the day...

2006-03-27 Thread Craig Barratt
Khaled Hussain writes:

 1. How can I clear all pending backups other than by going into each
host being backed up and stopping it?

The only other way is to kill BackupPC.  Not very graceful...

 2. At every wakeup, does BackupPC reload the per-pc configs? Or do I
need to restart BackupPC every time I update a per-pc config?

No, you don't need to restart BackupPC.

Craig




Re: [BackupPC-users] Integer overflow in octal number

2006-03-28 Thread Craig Barratt
Andy writes:

 I have been using BackupPC 2.1.1 (from Debian) to backup a number of 
 linux hosts over RSync for some time without any difficulties.
 
 But recently I added a Windows 2000 server with several shares to be 
 backed up over SMB and am getting the following error in the Xfer log:
 
tarExtract: Integer overflow in octal number at
/usr/share/backuppc/bin/BackupPC_tarExtract line 224.
 
 So far these have only been recorded for full backups, but perhaps 
 incremental backups are less likely to report the error just because 
 fewer files are transferred.

This should be fixed in BackupPC-2.1.2pl1.

Craig




Re: [BackupPC-users] can't do backups (of Windows shares with problematic permissions)

2006-03-29 Thread Craig Barratt
Tomasz Chmielewski writes:

 Tomasz Chmielewski wrote:
  Tomasz Chmielewski wrote:
  Tomasz Chmielewski wrote:
 
  (...)
 
  Unfortunately, it breaks in the same way:
 
  2006-03-28 13:24:20 full backup started for share S$
  2006-03-28 13:33:09 Got fatal error during xfer (Didn't get entire 
  file. size=460206, nread=131040)
  2006-03-28 13:33:14 Backup aborted (Didn't get entire file. 
  size=460206, nread=131040)
  2006-03-28 13:33:14 Saved partial dump 0
 
  It seems to be related to this thread:
 
  http://lists.samba.org/archive/linux-cifs-client/2006-March/001230.html
  http://lists.samba.org/archive/linux-cifs-client/2006-March/001232.html
  http://lists.samba.org/archive/linux-cifs-client/2006-March/001233.html
 
  In short: it's possible to create longer filenames on the Windows 
  machine than on a Linux machine.
 
  And this is when BackupPC (or smbclient?) seems to break.
  
  More info:
  
  smbclient 192.168.10.4\\C\$ -U user%pass -E -N -d 1 -c tarmode\ full 
  -Tc -
  
  creates the tar file just fine.
  
  When uncompressing the archive (tar -xf), tar complains about file names 
  being too long, but doesn't break.
  
  So the problem lies somewhere in BackupPC?
 
 If someone wants to reproduce, I can send him a package with problematic 
 filenames, which cause the backup job to abort.
 
 It is enough for the file to be in C:\1\... on a Windows drive; the backup 
 job will fail right away when it reaches it.

Please send me the tar file (off list) that gives the problems.

Craig




Re: [BackupPC-users] rsyncd on Windows can't handle utf8 filenames?

2006-03-29 Thread Craig Barratt
Tomasz Chmielewski writes:

 Does this mean that in 3.x it will be safe to use rsyncd for Windows 
 hosts (now special utf8 characters are translated to ?)?

Yes.

Craig




Re: [BackupPC-users] Single Host Blackout/Wake Up

2006-04-12 Thread Craig Barratt
Tomasz Chmielewski writes:

 Les Stott wrote:
  Hi,
  
  I have a client with 50 or so pcs, all doing rsync backups throughout 
  the day to a backuppc server. Works Great.
  
  We also have a linux box running a Cyrus Imap Store and we were running 
  rsync backups too, just getting the cyrus directories. This was working 
  great until a month or so ago when the rsyncs started failing with 
  Child Exited Prematurely.
 
 (...)
 
  2. Alternatively is there a way to schedule a backup of this  host via 
  cron? and just disable regular backups?
 
 Put it into config.pl for the host:
 
 $Conf{FullPeriod} = -1;
 
 And then you can start cronjobs with:
 
 su -l backuppc -c "/srv/backuppc/bin/BackupPC_dump -v -f old-backups" > /dev/null

You shouldn't use this form to start backups, except
for debugging.  That's because BackupPC itself isn't
aware that BackupPC_dump is running unless it starts
it itself.  It could be running BackupPC_nightly at the
same time.  Also, the linking step doesn't happen if you
run BackupPC_dump manually.

You should use cron to run BackupPC_serverMesg to start
the backup.  See the list archive for the syntax.

Craig




Re: [BackupPC-users] $Conf{DumpPostUserCmd} output to /dev/null

2006-04-28 Thread Craig Barratt
Rein writes:

 This is likely due to me not speaking Perl, but what's the best way to 
 achieve subj?
 
 Here's how it works for me now -- is there a better way?
 
 I'm using an expect script to shut the computer down after the backup 
 completes, but I don't want the output of that script to end up in the 
 transfer log, because it contains the password.
 
 For some reason expect's own fork ... disconnect routine doesn't work 
 when BackupPC executes the script as DumpPostUserCmd.
 
 Setting
 $Conf{DumpPostUserCmd} = 'shutd.sh $host > /dev/null';
 
 didn't work either -- the '> /dev/null' part is passed as an argument to the 
 'shutd.sh' script.
 
 So right now I'm using a wrapper script that takes the hostname as an 
 argument and calls the original expect script, directing it's output to 
 /dev/null.

To improve security, BackupPC doesn't use a shell to execute commands
- it forks and execs the commands directly.  So any shell constructs,
like redirection, don't work.  Your approach of using a wrapper shell
script is the right one.
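A minimal wrapper of the kind described might look like this (the shutd.sh
path and the convention that BackupPC passes the host name as the first
argument are assumptions; adjust to your setup):

```sh
#!/bin/sh
# Configured as: $Conf{DumpPostUserCmd} = '/path/to/wrapper.sh $host';
# Discard all output so the password echoed by the expect script never
# reaches the transfer log.
exec /usr/local/bin/shutd.sh "$1" > /dev/null 2>&1
```

Since the wrapper is itself a shell script, the redirection inside it works
even though BackupPC execs the wrapper without a shell.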

Craig




Re: [BackupPC-users] HardLinkMax not working?

2006-05-04 Thread Craig Barratt
David Mansfield writes:

 I have a file in the cpool with 32000 links.  HardLinkMax is set to 
 31999.  Let's assume this is exactly correct, because one link is 
 reserved for the 'pool'.
 
 However, last night during the second full backup ever for this host, I 
 got the following kind of error 24081 times:
 
 Unable to link 
 /data/pc/terranova/0/f%2f/fexport/fsmbshare/fpcdrive/fdwiech/fjesse/fcvswork/fpwstat/fCVS/fRoot
  
 to 
 /data/pc/terranova/new//f%2f/fexport/fsmbshare/fpcdrive/fdwiech/fjesse/fcvswork/fpwstat/fCVS/fRoot
 
 The file is a 'cvsroot' file, 'sandbox/module/CVS/Root', and since the 
 same exact file exists in every subdirectory of every checked out 
 project for every user on our system, this files gets a TON of links.
 
 Does the above mean that the HardLinkMax isn't working for a subsequent 
 full backup of an existing host?
 
 Note: the link that is failing is NOT from the pool to a backup 
 directory, but from one backup directory to another.

Rsync directly links from one backup to the next.

This is a bug in 2.1.2 which is fixed in cvs.

Craig




Re: [BackupPC-users] Editing the web interface

2006-05-18 Thread Craig Barratt
Lee A. Connell writes:

 Which files contains the left navbar, I want to add some more links to
 the navigation.

There's a config variable for this: $Conf{CgiNavBarLinks}.

Craig




Re: [BackupPC-users] Need tech review of backuppc chapter

2006-05-29 Thread Craig Barratt
Curtis Preston writes:

 My name is W. Curtis Preston, and I'm the author of Unix Backup 
 Recovery from O'Reilly.  The first edition sold over 40,000 copies, and
 we're looking to update it.  One of the things we did was add a chapter
 on BackupPC.
 
 The chapter was written by Don Duck Harper, and we're looking for two
 volunteers to do a tech review for us.
 
 It's a 10-page page-turner, and you'd have a week or so to review it.
 You'd get an autographed copy of the book and be mentioned in the
 acknowledgments.
 
 Any takers?

Sure.  I'm happy to be one of the volunteers.

Craig




Re: [BackupPC-users] Patching

2006-06-05 Thread Craig Barratt
Lowe, Bryan writes:

 I am trying to install BackupPC for the first time. I have done all the
 preliminary work, downloading and installing everything I need to run
 BackupPC on my Solaris 9 box.  The BackupPC documentation says to next
 untar the BackupPC-2.1.2, then apply any patch that's available.  This
 is where I get stuck.  I follow the documentation to a T, it says:
 
 --
 
 tar xvf BackupPC-2.1.2.tar.gz
 
 cd BackupPC-2.1.2
 
 patch p0 < ../BackupPC-2.1.2pl1.diff

Start over, and use -p0 instead of p0:

patch -p0 < ../BackupPC-2.1.2pl1.diff

Craig




Re: [BackupPC-users] backuppc bug tracker?

2006-06-07 Thread Craig Barratt
Ralf Gross writes:

 I was looking at the backuppc home page and the sourceforge project page
 for the bug tracker that backuppc is using. I couldn't find any info on
 how to file a bug, what is the recommended way to do this?

BackupPC doesn't use the SF bug tracker.  Just post to the devel
or user mail list.

Several people have seen the empty output problem.  I haven't
been able to recreate it.

There is a new version of File::RsyncP that is close to release
that you could try.  I can email it to you if you want.

Craig




Re: [BackupPC-users] archive to tape problem

2006-06-07 Thread Craig Barratt
Mark Coetser writes:

 I am having a little trouble getting this running, I have read the docs etc
 
 Here is my config.pl for the archive host
 
 # Set this client's XferMethod to archive to make it an archive host:
 $Conf{XferMethod} = 'archive';
 
 # The path on the local file system where archives will be written:
 #$Conf{ArchiveDest} = '/dev/st0';
 
 $Conf{ArchiveClientCmd} = '$tarCreatePath -h $host -n $backupnumber -s * > /dev/st0'
 
 And I get the following error
 
 Archive failed (  compPath fileExt splitSize outLoc parFile share)
 
 I have tried
 
 $Conf{ArchiveClientCmd} = '$Installdir/bin/BackupPC_archiveHost'
 . ' $tarCreatePath $splitpath $parpath $host $backupnumber'
 . ' $compression $compext $splitsize $archiveloc $parfile *';
 
 But still the above doesn't work, does anyone have any advice or an example
 of their setup for archive to tape ?

You should set the archive location in the CGI interface to
/dev/st0. That will write to tape on the BackupPC server.
The default ArchiveClientCmd should work correctly.

Since BackupPC doesn't use a shell to run external commands, redirection
(>) won't work.

Craig




Re: [BackupPC-users] Host Backup Summary

2006-06-07 Thread Craig Barratt
Bill Hudacek writes:

 One small note, if I may...I'm not sure about others' thoughts on this 
 matter, but I shared your OMG moment, Travis :-)
 
 When I discovered BackupPC, I was very pleased with it.  I still am, 
 lest anyone think otherwise!
 
 However, I was very dismayed to find the word "incremental" being used 
 to describe the backups performed between full backups.  It's pervasive 
 throughout the docs, FAQs, and this news group.
 
 As a (former, in a past life) UNIX systems manager for an international 
 corporation's data center in the U.S., I dealt with these kinds of backups:
 
 1.Full
 2.Differential
 3.Incremental
 
 A differential the day after a full would backup only those files that 
 changed that day.  A differential the day after that would back up all 
 files that had changed since the full backup - in other words, the 
 contents of the first day's differential backup was included in the 
 second day's backups.  This was not as bad as it sounds, as a file 
 deleted before the backups on day 2 would still appear on day 1's 
 differential backup.
 
 Incrementals, however, meant only files changed since the last backup 
 of any kind (full, differential, incremental) would be backed up.
 
 Thus, incrementals were by nature very constant in terms of execution 
 time and storage media consumed.  When you have hundreds of big-iron 
 servers with local disk, SAN space, and NAS space, the difference 
 between incrementals and differentials can be huge (in terms of running 
 time and space required).
 
 I'm not even speaking as a UNIX guy here, this operational approach was 
 used for PCs and VAX VMS boxen too.
 
 I would have been happiest with a three-tiered backup model in backuppc, 
 as my use of an on-line backup server means having 1 full, the latest 
 differential (say, from full + 3 weeks), and three incrementals to 
 restore is not an inconvenience at all.  Back when we used tape sets 
 (RAID-5 arrays of tape drives), we had to find all the media and request 
 that they be brought back on-site.  Thus, we did differentials once per 
 week so we did not have dozens or hundreds of tapes to restore.  At most 
 we needed full backup set + one differential set + max # of incremental 
 sets since the last differential.
 
 However, having said that - with my BackupPC environment, instead of 
 running a full backup once a month, and having differentials weekly, 
 with incrementals daily, I simply run fulls weekly and thus the 
 differentials that backuppc calls incrementals do not ever approach 
 the cost of a full backup (which I would consider not just an 
 inconvenience but a serious problem).

In the spirit of dump(1), incrementals in BackupPC are all
level 1 (ie: refer back to the previous level 0, or full).

A yet-to-be-implemented feature is multi-level incrementals.
The goal would be to allow the level of each incremental to
be configured.  This would give you the flexibility to support
(in your terminology) full, differential and incremental backups.
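The reference rule for dump(1)-style multi-level incrementals described above
can be sketched in a few lines (an illustration only, not BackupPC code):

```python
def reference_index(levels, new_level):
    """Return the index of the backup a new incremental of level
    `new_level` refers back to: the most recent existing backup with
    a strictly lower level (fulls are level 0)."""
    for i in range(len(levels) - 1, -1, -1):
        if levels[i] < new_level:
            return i
    raise ValueError("no full (level 0) backup exists yet")

# With existing backups at levels [0, 1, 2, 2], a new level-2
# incremental refers to the level-1 backup at index 1, while a new
# level-1 incremental refers all the way back to the full at index 0
# (the "differential" behavior described above).
print(reference_index([0, 1, 2, 2], 2))  # → 1
print(reference_index([0, 1, 2, 2], 1))  # → 0
```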

Craig




Re: [BackupPC-users] backuppc bug tracker?

2006-06-07 Thread Craig Barratt
David Rees writes:

 On 6/7/06, Craig Barratt [EMAIL PROTECTED] wrote:
  There is a new version of File::RsyncP that is close to release
  that you could try.  I can email it to you if you want.
 
 Out of curiosity, what's new in File::RsyncP?

Support for hardlinks (also requires BackupPC 3.x or CVS)
and a protocol update to v28.

Craig




Re: [BackupPC-users] how check if files are really compressed

2006-06-07 Thread Craig Barratt
Ambrose Li writes:

 On 05/06/06, Víctor A. Rodríguez [EMAIL PROTECTED] wrote:
  - copy fadsutil.vbs to a new location, try to bunzip2 it and if successful
  you'll have a fadsutil.vbs with the same length and content as the original
  one
 
 This method won't work. Backuppc seems to add a header or something to
 the beginning of files; I don't think gzip or bzip2 will be able to
 understand backuppc's compressed files.

That's right, the files aren't in gzip or bzip2 format.  The script
bin/BackupPC_zcat can be used to uncompress the files.
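For example (install path, TopDir and the f-mangled file name below are
illustrative; substitute your own):

```sh
# Uncompress one stored file and compare it against the original:
/usr/local/BackupPC/bin/BackupPC_zcat \
    /data/BackupPC/pc/somehost/0/f%2f/fetc/fhosts > /tmp/hosts.out
diff /tmp/hosts.out /etc/hosts
```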

Back to the original question: just look in the PC summary and you
will see the compress level for each backup.  If it is non-zero
then the backup files are compressed.

The other way to verify is to look at the cpool (compressed pool)
vs pool (uncompressed pool) usage.

Craig



Re: [BackupPC-users] Strane errors with smb

2006-06-10 Thread Craig Barratt
Michael Zehrer writes:

 I'm doing an smb type backup and I receive tons of errors like this in
 the Xfer.log:
 
 tarExtract: [EMAIL PROTECTED]
 tarExtract: i/rd3s452C[2l0['E$].;OW/U+g'\D-U^YA4aePeMD05\8`hEODp#=BE_*lYnJ
 tarExtract: VQ;N*A@/IBG,ZKBJpbJMIWBESI#hQ3Xi)d7Vp3u1bH11%BggPpTk0Epth`9D
 tarExtract: Y?JQK'T]G@6/8.[EMAIL PROTECTED](`3_(;J-]BD]
 tarExtract: f%C[m,gsed`t*hl5rKb=-9piS1AG\L8:dO_]nfUO*UV))bkBk+8#Fud#K$bGt
 tarExtract: $Z#b[LQ: checksum error at 
 
 in result most filenames are destroyed which looks like this:
 
 http://zepan.org/pics/backuppc-screen.jpg
 
 What's wrong?

What version are you running?  2.1.2pl1 has this fix:

  - Fixed bug in BackupPC_tarExtract for files > 8GB in size whose
lengths are multiples of 256.  Reported by Jamie Myers and
Marko Tukiainen, who both helped debugging the problem.

Craig




Re: [BackupPC-users] rsyncd based only on full's

2006-06-14 Thread Craig Barratt
Paul writes:

 I'm doing my backup using the rsyncd service on the PC's.
 I'm also quite new to backuppc.
 
 I was wondering why the rsyncd method uses as base the last full,
 and is not taking into account the last incremental.
 My experience with rsync is mostly Unix, and there I can mirror
 a tree using only minimal network traffic and CPU-usage.
 But of course, the resulting tree is not linked into some pool.
 
 Is it just implementation issues, easier to base on a full, much more
 difficult to use the combination of full + incrementals?
 Would it be easier when I fill in the incrementals?
 Or or there more fundamental reasons?

It's just something that isn't fully implemented yet.
Most of the pieces are in place.

However, it will take more work to support only incrementals
since that would require the original full and all incrementals
to be kept forever, and an increasing number of incrementals
would have to be merged together.  That's why doing a periodic
full is necessary.  The solution would be to fill in each
incremental as the oldest backup is removed.  That said, doing
periodic fulls is good practice since that's the only time
the file contents are really checked.

 Also, there is no '--delete' option used: an incremental is
 now unaware of deleted files.  Is that because File::RsyncP has
 no provision for that?

The delete operation is not really meaningful since BackupPC
is creating a new directory tree for each backup.  With rsync
it does keep track of deleted files in incrementals.

 Les M. also mentioned that each backup method has some quirks that
 are only resolved when doing a full.
 Are there any issues that rsyncd runs into when doing only
 incrementals? (besides the notion of removed files)

The quirks are for tar and smb, not rsync.  Removed and renamed
files work with rsync incrementals.  The significant missing pieces
with rsync are hardlinks and ACLs.  Hardlinks will be supported in
the next release.  ACLs will be supported with BackupPCd.

Craig




Re: [BackupPC-users] Funny ssh Problems

2006-06-14 Thread Craig Barratt
Nils Breunese (Lemonbit Internet) writes:

 Travis Fraser wrote:
 
  What version of rsync are you using? Later versions need --devices
  changed to -D in $Conf{RsyncArgs}.
 
 I guess I need to change this for $Conf{RsyncRestoreArgs}  as well?

Yes.

Craig




Re: [BackupPC-users] deferred restore

2006-06-14 Thread Craig Barratt
Travis writes:

 I was trying to restore a folder so first I got the ok message and the
 request was sent. However, the process didn't start until 2 hrs later. 
 
 Today again, for another restore job, the request was sent at 7am but
 the 
 The log shows:
 7  success 6/14 08:53  0.2 22  0.0 0
 0
 
 In both cases, there was nothing showing on the status page before I did
 the restore. Nothing should be running coz the backup/incr was done few
 hrs prior to it. 

Was BackupPC_nightly running?  Look at the server LOG file.

Craig




Re: [BackupPC-users] Blackout settings

2006-06-14 Thread Craig Barratt
Thomas Maguire writes:

 I want most systems on my network to backup overnight but I have
 several notebooks that are only attached to the network during
 business hours. I reviewed the blackout settings in config.pl and
 wanted to know if I was interpreting them correctly.

 It seems  that if I use the following settings the system will
 learn which systems are not connected after working hours and
 back them up during the day when they become available.
 
 WakeupSchedule = [1..24]# wakeup every hour
 
 BlackoutBadPingLimit = 3 # default setting
 BlackoutGoodCnt = 7# default setting
 
 BlackoutPeriods =  [ # set blackout 8am to 5:30pm Monday 
 through Friday
   {
 hourBegin = 8.0,
 hourEnd   = 17.5,
 weekDays = [1, 2, 3, 4, 5],
   }
 ]
 
 
 Is this an accurate assessment?

Yes, this is correct.  Putting these settings in the main config
file will do the right thing.  Clients that are reliably on the
network will be subject to blackout (after a week).  Clients
that are not reliably on the network will be backed up whenever
they are available.
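The decision described above can be sketched roughly as follows.  This is an
illustration only -- the real logic lives inside the BackupPC server -- and it
assumes BackupPC's weekday numbering (0 = Sunday):

```python
def in_blackout(good_ping_cnt, hour, weekday, periods, blackout_good_cnt=7):
    """True if a backup should be skipped now: the host has answered
    pings for at least BlackoutGoodCnt consecutive wakeups AND the
    current time falls inside a configured blackout period."""
    if good_ping_cnt < blackout_good_cnt:
        return False  # not yet "reliably on the network"
    return any(
        p["hourBegin"] <= hour <= p["hourEnd"] and weekday in p["weekDays"]
        for p in periods
    )

# The 8am-5:30pm Mon-Fri period from the question above:
periods = [{"hourBegin": 8.0, "hourEnd": 17.5, "weekDays": [1, 2, 3, 4, 5]}]
print(in_blackout(7, 10.0, 3, periods))  # → True  (reliable host, Wed 10:00)
print(in_blackout(2, 10.0, 3, periods))  # → False (laptop, rarely connected)
print(in_blackout(7, 19.0, 3, periods))  # → False (outside the window)
```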

Craig




[BackupPC-users] BackupPC 2.1.2 pl2 (patch level2) released

2006-06-18 Thread Craig Barratt
I just released a new patch for BackupPC 2.1.2.  This changes
the --devices rsync option to -D in config.pl to fix the
fileListReceive failed bug with recent rsync versions.

This patch includes the earlier fixes in the prior pl1 patch.

The patches can be applied to a fresh 2.1.2 release
by downloading the single file BackupPC-2.1.2pl2.diff
from http://backuppc.sourceforge.net and following the
instructions in that file.  BackupPC should then be
re-installed from the fresh, patched, release.

The patched version is reported as 2.1.2pl2.

I've attached the cumulative list of issues fixed in the
patch.

Craig

ChangeLog:

 - In conf/config.pl, changed --devices to -D in $Conf{RsyncArgs}
   and $Conf{RsyncRestoreArgs} to fix fileListReceive failed and
   Can't open .../f%2f for empty output errors with rsync 2.6.7+.
   Fix proposed by Justin Pessa and Vincent Ho, and confirmed by
   Dan Niles.

 - Added patch from Michael (mna.news) to ignore file is unchanged
   message from tar 1.15.x during incremental backups.

 - Fixed creation of .rsrc directories in bin/BackupPC_tarExtract
   when used with xtar on MacOS.  Reported by Samuel Bancal and
   Matthew Radey, who helped with debugging.

 - Fixed bug in BackupPC_tarExtract for files > 8GB in size whose
   lengths are multiples of 256.  Reported by Jamie Myers and
   Marko Tukiainen, who both helped debugging the problem.

 - Fixed bug in lib/BackupPC/Xfer/RsyncFileIO.pm that caused
   incorrect deleted attributes to be set in directories
   where one of the files had an rsync phase 1 retry during
   an incremental.  Reported by Tony Nelson.




Re: [BackupPC-users] Incremental problem

2006-06-22 Thread Craig Barratt
Shohan writes:

 I want to keep the 6 last incrementals, which will be done automatically. So my
 configuration is as below.
 My configuration:
 
 $Conf{FullKeepCnt} = 1;
 
 $Conf{FullKeepCntMin} = 1;
 $Conf{FullAgeMax} = 30;
 
 $Conf{IncrKeepCnt} = 6;
 
 $Conf{IncrKeepCntMin} = 1;
 $Conf{IncrAgeMax} = 30;
 
 But the problem is I can see only 1 incremental in the web interface,
 and I don't see any more incrementals.
 What's wrong?

Are you saying it does extra fulls instead of incrementals?
Have you given it enough time for more backups to run?

What are $Conf{IncrPeriod} and $Conf{FullPeriod} set to?
Do you have a per-PC config.pl that overrides any settings?

Craig



Re: [BackupPC-users] Incremental problem

2006-06-23 Thread Craig Barratt
shohan writes:

 I don't want to make automatic full backups, so
 $Conf{FullPeriod}= -1
 $Conf{IncrPeriod}= 0.97

That means all backups are disabled, both full and incremental.

Craig

 On 6/23/06, Craig Barratt [EMAIL PROTECTED] wrote:
 
  Shohan writes:
 
   I want to keep 6 last incremental that will be done automaticly. so my
   configuration is as below
   My configuration:
  
   $Conf{FullKeepCnt} = 1;
  
   $Conf{FullKeepCntMin} = 1;
   $Conf{FullAgeMax} = 30;
  
   $Conf{IncrKeepCnt} = 6;
  
   $Conf{IncrKeepCntMin} = 1;
   $Conf{IncrAgeMax} = 30;
  
   But the problem is i can see only 1 incrementals in the web interface.
   But i dont see any more incrementals..
   Whats wrong?
 
  Are you saying it does extra fulls instead of incrementals?
  Have you given it enough time for more backups to run?
 
  What are $Conf{IncrPeriod} and $Conf{FullPeriod} set to?
  Do you have a per-PC config.pl that overrides any settings?
 
  Craig



Re: [BackupPC-users] (no subject)

2006-06-24 Thread Craig Barratt
SAJChurchey writes:

 I'm trying to restore a full backup to a newly installed server after we
 re-installed the OS. I'm trying to use rsync method. Whenever I try to
 restore files I get the following errors on the entire pool.
 
 Remote[2]: skipping non-regular file filename
 
 What could be causing these errors? None of the files in question are
 symlinks. So I don't understand what I need to do to restore this.
 
 Any help would  be greatly appreciated.

Hmmm.  It appears that the file mode is not being interpreted
correctly by rsync.

First thing to try: replace --devices with -D in both $Conf{RsyncArgs}
and $Conf{RsyncRestoreArgs}.

Failing that, try just a smaller subdirectory and you could try
setting $Conf{XferLogLevel} to around 5 or so to get more detailed
information.

If you are still unsuccessful, try tar (either by changing the XferMethod
or manually by running BackupPC_tarCreate and piping its output into
something like ssh -l root HOST tar -C /somewhere -xpf -).
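Concretely, the kind of pipeline meant here (install path, HOST, the backup
number and the share name are placeholders; check the paths inside the tar
with tar -tvf before extracting over a live tree):

```sh
# Stream backup number 12 of HOST's /home share straight to the client
# and unpack it there, preserving permissions:
/usr/local/BackupPC/bin/BackupPC_tarCreate -h HOST -n 12 -s /home . \
    | ssh -l root HOST tar -C /home -xpf -
```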

Craig



Re: [BackupPC-users] Got remote protocol 757955594

2006-07-10 Thread Craig Barratt
Kai Grunau writes:

 I'm using  backuppc version 2.1.0pl1 on  RedHat Enterprise 3 server
 
 before last weekend I had no problem but since then no BackupPC
 is running on  2 Solaris machines (no problem with the other linux 
 computer).
 
 I found in the Xferlog file following message :
 ---
 Running: /usr/bin/ssh -q -x -l root hathorjan /usr/local/bin/rsync --server
 --sender --numeric-ids --perms --owner --group --devices --links --times
 --block-size=2048 --recursive --exclude /lost+found/ . 
 /var/opt/SUNWmsgsr/store/partition/primary/
 Xfer PIDs are now 661
 Got remote protocol 757955594
 Fatal error (bad version):

The decimal value 757955594 is the first four bytes of the unexpected
text or error from the remote machine.  In hex it is 2D2D7C0A, ie: the
string --| followed by a newline.  Does your .cshrc or ssh login
sequence emit this string?
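The decoding can be checked with a couple of lines of Python:

```python
# 757955594 decimal == 0x2D2D7C0A: the four bytes the server read in
# place of rsync's protocol-version number, interpreted big-endian as
# in the hex value quoted above.
raw = (757955594).to_bytes(4, "big")
print(raw)                   # → b'--|\n'
print(raw.decode("ascii").rstrip("\n"))  # → --|
```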

This text should appear in ascii after the "bad version" message.  I'm
surprised to see the rest of that message is empty.

Craig




[BackupPC-users] BackupPC 3.0.0beta0 released

2006-07-13 Thread Craig Barratt
I have released BackupPC 3.0.0beta0 on SF at:

http://backuppc.sourceforge.net/

A new version 0.62 of File::RsyncP that is needed for rsync
hardlink support has also been released on SF and CPAN.

3.0.0beta0 has some substantial new features compared to 2.1.2.
New features include:

* Added configuration and host CGI editor.

* Added rsync hardlink support.  Requires latest version of
  File::RsyncP (0.62).

* Decoupled BackupPC_dump from BackupPC_nightly by making
  asynchronous file linking/delete robust to race conditions.
  Now only BackupPC_nightly and BackupPC_link are mutually
  exclusive so only one runs at a time, and BackupPC_dump and
  BackupPC_restore can run anytime.

* Added support for multi-level incrementals.  In the style of dump(1),
  the level of each incremental can be specified.  Each incremental
  backs up everything since the most recent backup of a lower level
  (fulls are always level 0).  Previous behavior was all incrementals
  were level 1, meaning they backed up everything since the last full
  (level 0).  Default configuration is all incrementals are level 1.

* Server file names are now in utf8 and optional conversion
  to/from client name charsets can be configured.  All CGI pages
  now use the utf8 charset.

* Added RSS support from Rich Duzenbury.

* Added optional checking of exit status of Dump/Restore/Archive Pre/Post
  UserCmd.

* For new installations configure.pl places all the configuration
  files below /etc/BackupPC, bringing it more in line with the
  File System Hierarchy Standard (FHS).

See the ChangeLog for full details.

Depending upon the reported bugs and issues there could be
additional patches and beta releases prior to the official
3.0.0 release.

Enjoy!
Craig




Re: [BackupPC-users] Unable to link errors and fix??

2006-07-13 Thread Craig Barratt
John writes:

 On 7/11/06, John Villalovos [EMAIL PROTECTED] wrote:
  Searching through the email archives it seems that it is a known issue
  that BackupPC will exceed the hard link count specified as a maximum.
  And then I got the impression that there is a yet as unreleased
  version which fixes this.
 
  Is it possible to get this fix?  Because I have hit 31999 hardlinks on
  my system and it is failing :(
 
 Any ideas on this?  Does anyone know if the tip of the CVS tree is
 safe to use, because I see that it is supposed to have the fix.

I just released 3.0.0beta0, which should fix the hardlink
limit issue with rsync.  Currently CVS head is identical
to the 3.0.0beta0 release.

Craig




Re: [BackupPC-users] strange glitches with some linux hosts

2006-07-14 Thread Craig Barratt
Ilya Rubinchik writes:

 What is this??
 
 2006-07-15 09:05:25 Backup failed on aaa.com (fileListReceive failed)
 2006-07-15 09:05:27 aaa.com: overflow: flags=0x63 l1=111 l2=1819243374, 
 lastname=dev/log
 2006-07-15 09:05:27 aaa.com: overflow: flags=0x65 l1=0 l2=-251658240, 
 lastname=dev/log
 2006-07-15 09:05:27 aaa.com: overflow: flags=0x4c l1=0 l2=295716024, 
 lastname=dev/log
 
 2006-07-15 09:00:05 Started full backup on billing (pid=31196, share=/)
 2006-07-15 09:00:11 Backup failed on billing (fileListReceive failed)
 2006-07-15 09:00:11 billing: overflow: flags=0x65 l1=119 l2=1702127986, 
 lastname=var/spool/postfix/private/tlsmgr

Try replacing --devices with -D in $Conf{RsyncArgs} and
$Conf{RsyncRestoreArgs}.

Craig




Re: [BackupPC-users] using the host lookup

2006-07-20 Thread Craig Barratt
Frank writes:

 I have just installed Backup PC and have three hosts. When I access the UI to 
 view the activity of the host I get the following message:
 Error: Only privileged users can view information about host store1.
 I am not sure why I can not view this host info. I set up the host name in 
 two locations:
 /etc/hosts and /u/BackupPC/conf/hosts
 Also,
 When I first log into the server and then access the UI, I am not
 seeing any summary of backup activity.  When looking at the online
 info I see screen shots that show hosts in various stages of backups.

You don't have admin privileges setup.  First, are you prompted to
login prior to accessing the cgi interface?  If not, look in the
docs.  Next, make sure your user name is in $Conf{CgiAdminUsers}.

Craig



Re: [BackupPC-users] can't exec /usr/local/BackupPC-2.1.2/bin/BackupPC_trashClean for trashClean

2006-07-22 Thread Craig Barratt
Nilesh writes:

 I am continuously getting the following entry in the log file, and it keeps growing.
 
 --
 2006-07-21 17:09:40 can't exec
 /usr/local/BackupPC-2.1.2/bin/BackupPC_trashClean
 for  trashClean

Does that path exist?  Is that file executable?  Does the
BackupPC user have permission to read/execute it?  What
happens if you run it manually as the BackupPC user:

su backuppc
/usr/local/BackupPC-2.1.2/bin/BackupPC_trashClean

What is the first line of BackupPC_trashClean, eg:

#!/usr/bin/perl

It should be the path to perl.  Does an executable perl
exist at that path?
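
The checks above can be scripted. This sketch inspects a throwaway script
rather than the real BackupPC_trashClean; substitute your installed path:

```shell
# Demo on a temporary script; point "script" at the real
# /usr/local/BackupPC-2.1.2/bin/BackupPC_trashClean instead.
script=$(mktemp)
printf '#!/usr/bin/perl\nexit 0;\n' > "$script"
chmod 755 "$script"

ls -l "$script"                 # exists? executable by the BackupPC user?
shebang=$(head -1 "$script")    # first line, e.g. #!/usr/bin/perl
interp=${shebang#??}            # strip the leading "#!"
echo "interpreter: $interp"
rm -f "$script"
```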

Craig



Re: [BackupPC-users] Understanding rsync

2006-07-24 Thread Craig Barratt
Nicholas writes:

 You must remember that by default BackupPC runs as user backuppc with
 limited access.  You could use sudo over SSH for local backups.
 
 i.e. $Conf{RsyncClientCmd} = '/usr/bin/sudo $rsyncPath $argList+';

...plus drop the + from $argList:

$Conf{RsyncClientCmd} = '/usr/bin/sudo $rsyncPath $argList';

since there is no shell that needs escaping of arguments.

Craig



Re: [BackupPC-users] No full backup with version 3.0.0 beta?

2006-07-31 Thread Craig Barratt
Matt writes:

 I am using the rsyncd transfer method and would like to have each new backup
 be based on the most recent incremental, thus completely avoiding full
 backups after the first initial backup.
 
 Is this now possible with version 3.0.0 beta?  Maybe $Conf{IncrLevels}
 can be used to this end, but I don't see how.

Unfortunately that won't work yet, even in 3.0.0.  In theory you
could set $Conf{IncrLevels} to a long, incrementing, sequence,
and set $Conf{FullPeriod} to be large too.

However, for each new backup, the full and all the incrementals
have to be merged together to get the most recent backup filled.
That's used as a reference for the next incremental.  The problem
is that takes more and more time for each backup (ie: after a week
every directory on the client requires 7 directory reads on the
server; after a month it is 30).  Also, no backups can be expired.

What's required is one more feature: filling in incrementals as
older backups are expired.  That way old backups can be deleted
but a filled backup is kept so that more recent backups can
be reconstructed.

Note that with rsync a full backup (after the first) doesn't involve
a lot more network traffic than an incremental.  In 3.0.0 the most
recent (merged) backup is used as a reference (rather than the last
full in 2.x) so files changed after the last full but before the
most recent incremental won't be transferred again.

The other reason to do a full is that the actual file contents are
checked.  Incrementals just check meta data.

Bottom line: you still should do periodic fulls.
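
If you still want to experiment with long incremental chains in 3.0.0, the
settings in question look something like this sketch (values are illustrative;
as noted above, each level adds merge time and blocks expiry):

```perl
# Hypothetical example only: a 6-level incremental chain with weekly fulls.
$Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
$Conf{FullPeriod} = 6.97;   # days between full backups
$Conf{IncrPeriod} = 0.97;   # days between incrementals
```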

Craig



Re: [BackupPC-users] Problems with File-RsyncP-0.62

2006-08-01 Thread Craig Barratt
ken writes:

 Anyone having problems with File-RsyncP-0.62?  It won't install with cpan.
 Linux: 2.6.17-1.2142_FC4
 Perl: perl-5.8.6-24

There is a new version, File-RsyncP-0.64, on SourceForge and CPAN
that fixes this problem.

Craig



Re: [BackupPC-users] I can't start Backuppc beta 3 on Ubuntu

2006-08-05 Thread Craig Barratt
Vincent writes:

 I am trying to run BackupPC beta 3 on my Ubuntu server.
 I installed it and ran the configure.pl script. There were no errors.
 But when I try the init script /etc/init.d/backuppc start I always get
 the following error:
 
 Starting backuppc: No language setting
 BackupPC::Lib->new failed
 
 Can someone help me, please?

Looks like it can't read the config file correctly.  Most likely
either a path or permissions problem.

Is this a new installation or upgrade?

Where is BackupPC installed?  Confirm that /etc/init.d/backuppc
has the correct paths and user.  On a new installation the config
file will be in /etc/BackupPC/config.pl.  Make sure that file exists
and has the right permissions.

Failing that, email me offlist these three files: /etc/init.d/backuppc,
lib/BackupPC/Lib.pm and /etc/BackupPC/config.pl.

Craig



Re: [BackupPC-users] Install problem: can't open bin/BackupPC for reading

2006-08-07 Thread Craig Barratt
Fred McCann writes:

 I'm trying to install BackupPC-2.1.2 on FreeBSD 6. I'm entering the  
 BackupPC-2.1.2 directory and running as root:
 
 ./configure.pl
 
 I get through all the questions and then this happens:
 
 Ok, we're about to:
 
- install the binaries, lib and docs in /usr/local/www/data/BackupPC,
- create the data directory /backup/backuppc,
- create/update the config.pl file /backup/backuppc/conf,
- optionally install the cgi-bin interface.
 
 -- Do you want to continue? [y]?
 Created /backup/backuppc/.
 Created /backup/backuppc/conf
 Created /backup/backuppc/pool
 Created /backup/backuppc/cpool
 Created /backup/backuppc/pc
 Created /backup/backuppc/trash
 Created /backup/backuppc/log
 Installing binaries in /usr/local/www/data/BackupPC/bin
 can't open bin/BackupPC for reading
 
 
 As far as I can tell, bin/BackupPC exists prior to running the script  
 and is gone afterwards.
 
 I've made sure all the prerequisites are in place. I'm not sure what  
 to do from here. Any advice would be greatly appreciated.

The only thing I can think of is that you are installing BackupPC
in the same place you unpacked BackupPC-2.1.2.tgz.  configure.pl
unlinks the destination of the copy if it already exists.  That's
the only place it does an unlink().

Craig



Re: [BackupPC-users] Errors with 3.0beta1

2006-08-11 Thread Craig Barratt
Tony Molloy writes:

 I've just upgraded backuppc to 3.0beta1. The upgrade went OK and I did a 
 small test backup which went OK.  I'm having several problems with the 
 Cgi interface though.
 
 When I look at the Log File I sometimes get the following:
 
 Software error:
 
  Undefined subroutine &BackupPC::CGI::View::action called \
 at /var/www/cgi-bin/BackupPC_Admin line 109.
 
 and sometimes it works!!
 
 When I look at the Old Log Files I get
 
 Software error:
 
  Can't locate object method "sortedPCLogFiles" via package "BackupPC::Lib"
at /opt/backuppc/lib/BackupPC/CGI/LOGlist.pm line 59.
 
 When I select a host to view and again look at the log file I get the same
 error.
 
 Software error:
 
  Undefined subroutine &BackupPC::CGI::View::action called 
 at /var/www/cgi-bin/BackupPC_Admin line 109.

Hmmm.  Are you running mod_perl?  If so, I recommend you restart
apache since it is possibly running some pre-upgrade code.  The
clue is that it sometimes works - that's consistent with several
of the httpd processes having old code and others new code.

Craig



Re: [BackupPC-users] 3.0.0beta1 problem.

2006-08-11 Thread Craig Barratt
Jonathan writes:

 I have updated my system to 3.0.0beta1, and when I try to start the
 backuppc server I always get this error:

 2006-08-10 13:38:09 Another BackupPC is running (pid 3151);
 quitting...

BackupPC reads the PID file (eg: $TOPDIR/log/BackupPC.pid)
and checks if that process exists.  That file is meant to
be removed when BackupPC exits.

In your case another process (httpd) happens to have the same pid,
likely because BackupPC exited non-cleanly and the process numbers
have since wrapped.  So, yes, BackupPC isn't really running but
BackupPC thinks it is.

Just remove the pid file.
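
The stale-PID check BackupPC performs can be sketched as follows; the PID
file here is a temporary stand-in for $TOPDIR/log/BackupPC.pid:

```shell
# Create a PID file pointing at a process that has already exited,
# then remove the file only if that PID is no longer alive.
pidfile=$(mktemp)               # stands in for $TOPDIR/log/BackupPC.pid
sleep 0 & demo_pid=$!           # a short-lived process...
wait "$demo_pid"                # ...now reaped, so its PID is stale
printf '%s\n' "$demo_pid" > "$pidfile"

if ! kill -0 "$(cat "$pidfile")" 2>/dev/null; then
    echo "stale PID file, removing"
    rm -f "$pidfile"
fi
```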

Craig



Re: [BackupPC-users] Windows/Linux speed differences (windows is faster?)

2006-08-11 Thread Craig Barratt
Cameron Dale writes:

 I'm backing up several different machines on my local network to my
 debian-based server using BackupPC. I'm using rsyncd for all of this,
 2.6.8 on the Linux machines, and cygwin-rsyncd-2.6.2_0.zip on the
 Windows machines. One of the machines is even dual booted, and the
 same data is backed up sometimes by Windows, sometimes by Linux.
 
 I'm seeing differences in the speed and number of files transferred
 for incremental backups on Linux compared to Windows. Looking at the
 XferLog files, it seems that the Windows machines are only
 transferring the changed files (the log contains create d for all
 the directories, and create for a few new files), whereas the Linux
 machines transfer all the files (the log contains similar create d
 and create entries, as well as same entries for ALL the unchanged
 files).
 
 This is most apparent on the dual-boot machine as some incrementals
 are small and fast, whereas others are large and slow. Here is the
 backup listing for that machine. Note that backups 1, 3, 7, and 10
 were all when it was booted Linux, the others were all Windows.
 
                 Totals                  Existing Files    New Files
 Backup#  Type  #Files  Size/MB  MB/sec  #Files  Size/MB  #Files  Size/MB
 0        full    4905   2239.8    3.21     225      8.2    5072   2231.7
 1        incr    4905   2239.8   10.37    4906   2239.8     391      0.1
 2        incr       7      5.7    0.08      18      1.8      42      3.9
 3        incr    4905   2239.5    3.94    4904   2239.2       3      0.4
 4        incr       8      5.8    0.08       2      2.0      10      3.8
 5        incr       8      5.9    0.08       2      2.0       8      3.8
 6        incr       8      5.9    0.09       2      2.0      10      3.9
 7        incr    4905   2239.7   12.65    4900   2235.9       9      3.8
 8        incr       8      5.9    0.09       7      5.6       2      0.4
 9        incr      34     24.6    0.35      26     20.2      14      4.4
 10       incr    4730   2140.5   18.61    4725   2136.6      11      3.9
 11       incr      34     24.7    0.37      28     20.6      11      4.0
 
 I suspect this is somehow related to checksum caching, but I have
 enabled the checksum-seed option globally and I think all the rsync's
 I'm using should have it available. Is there something else in the
 cygwin-rsyncd-2.6.2_0.zip version of rsync that is speeding up the
 Windows backups? Can I somehow get that on my Linux machines too?

Yes, the entire set of files is being transferred on an
incremental with a linux boot.

That means some meta data (eg: uid, gid, mtime, size) is being
delivered differently to rsync on windows vs linux.

It is quite possible that the uid and gid are different when you boot
windows vs linux.  It might be related to file time stamps.  What type
of file system is this?  If it is FAT then you are likely a victim of
the DST problem.  Google "rsync FAT DST".

I recommend doing some manual rsyncs and seeing what meta data
you get after the transfer with windows vs linux.
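
One way to compare: print the exact fields an rsync incremental checks for a
file and diff the output between the two OSes. This generic sketch uses GNU
stat on a temporary file; run the same command on the real files under each OS:

```shell
# Show mode, uid, gid, mtime and size -- the metadata an rsync
# incremental compares to decide whether a file has changed.
f=$(mktemp)
chmod 644 "$f"
meta=$(stat -c 'mode=%a uid=%u gid=%g mtime=%Y size=%s' "$f")
echo "$meta"
rm -f "$f"
```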

Craig



Re: [BackupPC-users] Windows/Linux speed differences (windows is faster?)

2006-08-11 Thread Craig Barratt
Cameron Dale writes:

 Craig Barratt said the following on 11/08/2006 1:47 AM:
  Yes, the entire set of files is being transferred on an
  incremental with a linux boot.
 
 But why is this happening? What is the difference between Windows and
 Linux that would cause this?
 
  That means some meta data (eg: uid, gid, mtime, size) is being
  delivered differently to rsync on windows vs linux.
  
  It is quite possible that the uid and gid are different when you boot
  windows vs linux.  
 
 As I said in my post, this problem is easiest to see on the dual-boot
 machine, however it also is apparent on other machines.
 
 Here is the backup listing for a Windows machine, where you can see that
 the number of files transferred after the first backup is quite small:
 
                 Totals                  Existing Files    New Files
 Backup#  Type  #Files  Size/MB  MB/sec  #Files  Size/MB  #Files  Size/MB
 0        full     800    676.7    2.23      87     47.4     795    629.3
 1        incr       5     34.9    0.94       2      9.2      21     25.7
 2        incr       5     34.9    1.20       2      3.0       8     31.9
 3        incr       5     34.9    0.85       6     34.9       0      0.0
 4        incr      11     35.4    0.96       4     19.3      17     16.0
 5        incr      11     35.4    0.47      12     35.4       0      0.0
 6        incr      11     35.4    0.91      12     35.4       0      0.0
 7        incr      11     35.4    1.41      12     35.4       0      0.0

Yes, this case looks normal.

 Here is the backup listing for a Linux machine, where the incrementals
 are getting bigger and bigger each time:
 
                 Totals                  Existing Files    New Files
 Backup#  Type  #Files  Size/MB  MB/sec  #Files  Size/MB  #Files  Size/MB
 0        full    2350      5.3    0.35     397      0.1    2235      5.3
 1        full    2633      5.7    0.38    2351      5.2     576      0.5
 2        incr   14982    187.1    0.82   10703     36.1    5832    151.5
 3        incr   14982    187.6    1.26   14944    159.7      47     27.9
 4        incr   14984    187.9    1.24   14943    161.7      53     26.3
 5        incr   15705    194.0    0.96   15650    166.6     370     27.6
 6        incr   15705    194.9    1.05   15666    166.3      48     28.6
 7        incr   15705    195.4    1.07   15670    167.2      43     28.3
 8        incr   16207    197.4    1.37   16060    167.0     268     30.4
 9        incr   16211    197.8    1.58   16172    168.9      57     29.0
 10       incr   16240    201.8    1.54   16199    169.9      51     31.8
 11       incr   16240    202.2    1.58   16202    172.0      48     30.2
 12       incr   16241    188.5    1.47   16200    168.6      51     19.9
 13       incr   16245    189.5    1.39   16201    170.7      56     18.9
 14       incr   16251    190.6    1.27   16203    170.7      64     19.9
 15       incr   16253    191.7    1.17   16213    171.0      53     20.7

What doesn't make sense is that the two fulls have far fewer files
than the incrementals.  I suspect you set up a small test for backups
#0 and #1, then set it to back up a lot more prior to incremental #2.
Therefore each incremental is backing up a lot of files not in the
full.  You should start a full backup and then see what happens
with the next incrementals.

  It might be related to file time stamps.  What type
  of file system is this?  If it is FAT then you are likely a victim of
  the DST problem.  Google rsync FAT DST.
 
 Nope, it's NTFS on Windows, ReiserFS on Linux.

My original claim still stands: on the dual boot system I
suspect the uid/gid or mtime is not returned consistently
when your machine is booted on windows vs linux.  Therefore,
if the last full was from windows, then a linux incremental
will back up every file again (and vice versa).  With rsync
not a lot of data will be transferred, but it will take a
lot more time.  I suggested you manually run rsync in each
case to see.

Bottom line: you are seeing two different issues here.

Craig



Re: [BackupPC-users] Upgrading to 3.0 -- RESOLVED

2006-08-11 Thread Craig Barratt
Chris Stone writes:

 Had a bit of time to spend on this and I did get the upgrade installed
 without having to hack the scripts at all. I DID have to upgrade the
 Encode package to 2.18 and that took care of it all and the install
 completed and backuppc started successfully.

Good detective work!

Yes, for some reason your older version of Encode made the
check for BackupPC::FileZIO fail.  What is your version of
perl?

I should add a check to configure.pl for this, at a minimum
to give a better error message.

Craig



Re: [BackupPC-users] Directory mysteriously excluded

2006-08-11 Thread Craig Barratt
Benjamin Kudria writes:

 On Friday, August 11 2006 1:56, Loyd Darby wrote:
  Try this (note the extra comma):
  $Conf{BackupFilesOnly} = ['/home/bkudria',
  '/usr/local/vpopmail/blueboxtech.com',];
 
  I am guessing that it isn't recognizing the end of the string and is
  re-using what is in the buffer.
 
  You might try including an --exclude of /tmp in the per-host config
  just to see what it does.
 
 
 Changed the per-host config to look like this:
 
 $Conf{BackupFilesOnly} = 
 ['/home/bkudria', '/usr/local/vpopmail/blueboxtech.com',];
 $Conf{BackupFilesExclude} = '/tmp';
 
 This yields an Xferlog of:
 
 Running: /usr/local/bin/sudo /usr/local/bin/rsync --server --sender 
 --numeric-ids --perms --owner --group -D --links --times --block-size=2048 
 --recursive --include=/home --include=/home/bkudria --include=/usr 
 --include=/usr/local --include=/usr/local/vpopmail 
 --include=/usr/local/vpopmail/blueboxtech.com --exclude=/\\\* 
 --exclude=/home/\\\* --exclude=/usr/\\\* --exclude=/usr/local/\\\* 
 --exclude=/usr/local/vpopmail/\\\* --exclude=/tmp --ignore-times . /
 Xfer PIDs are now 33447
 Got remote protocol 29
 Negotiated protocol version 26
 Sent include: /home
 Sent include: /home/bkudria
 Sent include: /usr
 Sent include: /usr/local
 Sent include: /usr/local/vpopmail
 Sent include: /usr/local/vpopmail/blueboxtech.com
 Sent exclude: /*
 Sent exclude: /home/*
 Sent exclude: /usr/*
 Sent exclude: /usr/local/*
 Sent exclude: /usr/local/vpopmail/*
 Sent exclude: /tmp
 Xfer PIDs are now 33447,34233
 [ skipped 2212 lines ]
 Done: 1767 files, 11011497 bytes

This list of includes/excludes is correct.  You should remove
the extra comma at the end of $Conf{BackupFilesOnly} though.
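
That is, the per-host setting would read (same paths as above, without the
trailing comma):

```perl
$Conf{BackupFilesOnly} = ['/home/bkudria',
                          '/usr/local/vpopmail/blueboxtech.com'];
```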

Just to state the obvious, does the directory
/usr/local/vpopmail/blueboxtech.com exist?  That's the
only directory you have asked to back up.

Is this a local file system or nfs mounted?  In the latter case, you
will have a permissions problem without the right export options since
you are running as root.  Secondly, if you are using an automounter
for /usr/local/vpopmail/blueboxtech.com then rsync won't trigger
the automounter since it just opens the directory /usr/local/vpopmail
to look for blueboxtech.com.

Craig



Re: [BackupPC-users] Avoid doing backups on specific weekday

2006-08-11 Thread Craig Barratt
Nicolai Nordahl Rasmussen writes:


 - I thought of using the $Conf{BlackoutPeriods} to simply define the
   whole sunday as a blackout period, but I'm afraid the incremental
   backup would then just be pushed to run monday instead?

This should work correctly.  Yes, Monday will do an incremental
after the blackout ends, but the schedule of the full backup
($Conf{FullPeriod}) will ensure that the full backup will still
occur on Friday or Saturday.  It might shift in time by a few
hours since $Conf{IncrPeriod} needs to elapse prior to the full.
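
A blackout covering all of Sunday would look something like this sketch
(values illustrative; weekDays uses 0 for Sunday through 6 for Saturday):

```perl
# Block backups all day Sunday.
$Conf{BlackoutPeriods} = [
    {
        hourBegin => 0,
        hourEnd   => 23.99,
        weekDays  => [0],     # 0 = Sunday
    },
];
```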

Craig



Re: [BackupPC-users] 3.0.0 beta 1 reports pool size to be zero

2006-08-12 Thread Craig Barratt
Ambrose writes:

 I am using 3.0.0beta1 and I am seeing that in the Status screen, almost
 everything is reported as zero (as copied below). I am wondering if others
 are seeing this or if I have done something wrong (maybe a permission
 problem?), or if this is just a case of something being not compatible with
 backuppc. This is an Intel Mac running system 10.4.

Just to confirm: this was an upgrade, and it used to work?

For an upgrade, by default, all the various config and status file
locations should be the same.  In particular, $LogDir/status.pl
(should be the same as $TopDir/log/status.pl in 2.x) is where the
pool file stats are stored.

However, if you did an install to a new location, then FHS
will put the status.pl file in /var/log/BackupPC/status.pl,
which will initially be empty.

So the possible explanations are:

 - status.pl is in a new location and it will be updated overnight
   when BackupPC_nightly runs.

 - or: BackupPC_nightly isn't running: look in the main log file
   to see if it runs each night.

Craig



Re: [BackupPC-users] 3.0.0 beta 1 reports pool size to be zero

2006-08-13 Thread Craig Barratt
Ambrose writes:

 On 12/08/06, Craig Barratt [EMAIL PROTECTED] wrote:
  Just to confirm: this was an upgrade, and it used to work?
 
 This is a new installation. However, I did at first install it to run
 as user daemon before managing to figure out how to create
 a separate backuppc user on Apple's very weird unix, so I did
 have to chown several files and directories after re-installing.
 
  So the possible explanations are:
 
   - status.pl is in a new location and it will be updated overnight
 when BackupPC_nightly runs.
 
   - or: BackupPC_nightly isn't running: look in the main log file
 to see if it runs each night.
 
 It seems that BackupPC_nightly did run last night. The first
 few lines in the LOG file read

BackupPC_nightly didn't find any files in the pool.
Can you look in the pool and confirm that is the case?
If it isn't empty, then there is some permissions, path or
installation problem that prevents BackupPC_nightly from
seeing the files.  If it is empty, either there aren't
any backups completed, or the linking to the pool fails,
which could be explained by $TOPDIR/pc being on a different
file system than $TOPDIR/cpool, or a permissions problem.

Craig



Re: [BackupPC-users] Windows/Linux speed differences (windows is faster?)

2006-08-14 Thread Craig Barratt
Cameron writes:

 I'm thinking of changing perms, owner, group, and maybe times all to
 no-OPTIONs. However, I'm concerned about how this will affect the
 program as I don't understand how File::RsyncP works, as it says in
 the comments. Can I go ahead and do this? Do I need times off or just
 the owner/group/perms off?

File::RsyncP doesn't understand the --no-OPTION options.
Just try removing:

'--perms',
'--owner',
'--group',

I haven't tested that before, so please tell me whether
it works.

Craig



Re: [BackupPC-users] ssh passphrase for backuppc user at startup?

2006-08-16 Thread Craig Barratt
Nathan writes:

 5.  Confirmed that backups succeed when I do su - backuppc and start
 BackupPC manually with the -d option.
 
 However, backups fail (fileListReceive failed) whenever I reboot and
 just let the provided init.d script start backuppc, I presume because
 there's never a chance to plug in the backuppc user's passphrase.

The fileListReceive failed error seems to imply it gets far enough
with ssh to negotiate the protocol etc.  So perhaps the clients
on which it fails are suffering from the rsync argument problem;
ie: try replacing '--devices' with '-D' in $Conf{RsyncArgs} and
$Conf{RsyncRestoreArgs}.

Craig



Re: [BackupPC-users] Windows/Linux speed differences (windows is faster?)

2006-08-16 Thread Craig Barratt
Cameron writes:

 On 8/14/06, Craig Barratt [EMAIL PROTECTED] wrote:
  Cameron writes:
 
   I'm thinking of changing perms, owner, group, and maybe times all to
   no-OPTIONs. However, I'm concerned about how this will affect the
   program as I don't understand how File::RsyncP works, as it says in
   the comments. Can I go ahead and do this? Do I need times off or just
   the owner/group/perms off?
 
  File::RsyncP doesn't understand the -no options.
  Just try removing:
 
  '--perms',
  '--owner',
  '--group',
 
   I haven't tested that before, so please tell me whether
  it works.
 
 I commented out those options, and did a full backup with Windows (so
 I guess commenting the options works fine). I got entries like this
 (lots of same and pool, a few create for empty file, and all the
  usual create d for directories):
   create   644   0/0   0 Filename
   same 644   0/02249 Filename
   pool 644   0/0 1117184 Filename
   create d 755   0/0   0 Dirname
 
 So it looks like I dropped the user/group, but the file permissions
 are still there. Then I did an incremental in Windows, which was the
 same as before (short and sweet). Then a long incremental in Linux
 gave entries like this (lots of same, lots of create for empty file,
 the usual create d):
   create   555   0/0   0 Filename
   same 555   0/0   41812 Filename
   create d 555   0/0   0 Directory
 
 This is exactly as before, so removing the options from rsync really
 had no effect on the linux backup. Just for completeness, here are the
 backup size summaries, full (17) and first incr (18) are on Windows,
 other (19) is on linux:
 17       full    4731   2140.6   10.10    4731   2140.6     292      0.1
 18       incr       0      0.0    0.00       0      0.0       3      0.0
 19       incr    4731   2140.6   16.59    4731   2140.6       5      0.0
 
 I feel like I'm closer, but I don't understand why rsync seems to be
 preserving the permissions on Windows even when I've told it not to.
 Any ideas? Do I need to empty my pool in order for the rsync option
 changes to be effective? Could this be related to how I'm mounting the
 Windows filesystem under linux (with options
 ro,users,auto,group,umask=0222)?

Yes, the umask will ensure that the file mode is different
in linux vs windows.

In any case, File::RsyncP ignores the --perms flag.

There are two places in the installed File/RsyncP.pm file (look
in perl's path from perl -V, or otherwise modify the file in a
File::RsyncP distribution and reinstall) where you will see this
line:

    $f->{mode} == $attr->{mode}

You should replace both lines with something like this:

    (!$rs->{rsyncOpts}{perms} || $f->{mode} == $attr->{mode})

I'll fix this in the next release of File::RsyncP.

Craig



Re: [BackupPC-users] Problem backing up a machine in the DMZ

2006-08-16 Thread Craig Barratt
Nicolai Nordahl Rasmussen writes:

 I only have the local dns in the resolv.conf file
 andromeda:~# cat /etc/resolv.conf
 search corena.dk
 nameserver 10.5.0.3
 
 I've added an entry for the server in the hosts file:
 
 andromeda:~# cat /etc/hosts
 127.0.0.1   localhost.localdomain   localhost
 10.5.0.7andromeda.corena.dk andromeda
 192.168.0.5 merkur.corena.dkmerkur
 
 But it keeps claiming: NO PING RESPONSE
 
 Is there a way of debugging what is actually going on? - Or maybe
 to increase the timeout value or something like that. I have a
 very hard time figuring out what it is that BackupPC does wrong
 - since I'm able to login, su to the backuppc user and execute the
 ping -c 1 merkur command _EVERY_ time...and I get instant response
 every time...

 Can I change the ping parameter in the per-pc config for that
 particular host, to try and mangle a bit around with it?

Rodrigo's suggestions are good ones.  Sounds like ping is not
reliable.

FYI, you can see exactly what BackupPC is doing by running:

su backuppc
BackupPC_dump -v -f merkur

and look at the output.  It should show you the exact commands and
steps it is taking. 

If it is just ping that is unreliable (unlikely) you can disable
the ping command by setting $Conf{PingCmd} to '/bin/true'.  But
it is likely that the next command (eg: ssh or rsync) will fail
if the network is unreliable.
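
As a sketch, the override mentioned above is a one-line change in that
host's per-PC config file:

```perl
# Per-PC config sketch: skip the ping check for this host only.
# /bin/true always exits 0, so BackupPC proceeds straight to the transfer.
$Conf{PingCmd} = '/bin/true';
```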

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] ssh passphrase for backuppc user at startup?

2006-08-16 Thread Craig Barratt
Nathan Barham writes:

 Thanks for the reply.  I took the Last error is fileListReceive
 failed from the Backup Summary page for the host in question.  I should
 have posted from the actual error log, which has this ...
 
 SNIP ...
 
 Fatal error (bad version): Permission denied
 (publickey,keyboard-interactive).

 fileListReceive() failed
 
 SNIP ...
 
 I'm not sure what "bad version" refers to, but "Permission denied" seems
 clear enough.  Backups succeed when I use a passwordless key, and also
 succeed when I use a key with a password but manually start the daemon
 as the backuppc user.  So, I think the problem is with the ssh
 authentication, not the rsync options.

"bad version" means BackupPC was expecting the rsync protocol
version, and instead it got this string from the remote client:

Permission denied (publickey,keyboard-interactive)

So you are right: BackupPC doesn't have the right credentials
for ssh.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC_dump coredumps

2006-08-17 Thread Craig Barratt
Marc Prewitt writes:

 We've been trying to troubleshoot one of our bigger dumps which keeps
 failing silently after a few hours.  It's actually not that big in terms
 of files (9221) but the files are rather large.  They are btrieve-like
 database files.
 
 I tried running the BackupPC_dump command from the command-line and see
 a failure after a few minutes.  The BackupPC_dump process coredumps with
 a segfault.  We were originally running BackupPC_dump under perl 5.6.1
 with File::RsyncP 0.54 but as a test I tried running with 5.8.2 and
 File::RsyncP 0.64.  Still get the exact same coredump as with 5.6.1.
 I've included the backtrace below.
 
 Any thoughts or suggestions on how to debug this further would be
 appreciated.  Currently, we're using backuppc 2.1.2p2. Rsync on the host
 and client are both 2.6.6.

Can you set $Conf{XferLogLevel} to 5 and email me (offlist) the
output from BackupPC_dump -v -f stats?

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] Interupted backups, partial files and Rsync

2006-08-17 Thread Craig Barratt
David writes:

 I am having a problem using BackupPC over a slowish VPN link with large 
 files. If BackupPC is aborted (mainly due to signal=ALRM) when 
 transferring a large file, BackupPC appears to delete the partially 
 transferred file, preventing Rsync from restarting where it dropped off.

Rodrigo mentioned $Conf{ClientTimeout} and you said you increased it.
But it sounds like by not enough.  What did you set it to?

 I have set the --partial in $Conf{RsyncArgs}, but when the backup is 
 aborted, the partially transferred file is deleted.

File::RsyncP (on the BackupPC end) will ignore the --partial option.

 The problem is that each time the same file times out, and as the 
 partially transferred file is deleted, next backup the same thing 
 happens, and the backup never completes.
 
 How do I prevent BackupPC from deleting the partially transferred file?

You can't.  BackupPC doesn't have a notion of a partially transferred
file.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] Error in cgi config editor for Email Settings - 3.0.0Beta1

2006-08-17 Thread Craig Barratt
Les Stott writes:

 When I edit Email Settings, Dest Domain or other and save, it blitzes 
 the config.pl file, and BackupPC cannot start.
 
 It's all to do with the email headers section.
 
 Previously I had this in the config file.
 
 $Conf{EMailHeaders} = <<EOF;
 MIME-Version: 1.0
 Content-Type: text/plain; charset=iso-8859-1
 EOF
 
 even if I don't modify EMailHeaders, but modify something else and save, 
 the EmailHeaders in config.pl ends up looking like
 
 $Conf{EMailHeaders} = 'MIME-Version: 1.0
 Content-Type: text/plain; charset=iso-8859-1
 ';
 ';
 
 Consequently this pops up in the cgi
 
 Error: Unable to read config.pl or language strings!!
 
 and this when you try to restart..
 
 [EMAIL PROTECTED] conf.d]# service backuppc restart
 Shutting down BackupPC:[  OK  ]
 Starting BackupPC: Bareword found where operator expected at 
 /etc/BackupPC/config.pl line 1872, near # by spaces, in 
 $Conf{CgiAdminUsers}. If you don't
   (Might be a runaway multi-line '' string starting on line 1857)
 (Missing operator before t?)
 Couldn't execute /etc/BackupPC/config.pl: syntax error at 
 /etc/BackupPC/config.pl line 1872, near # by spaces, in 
 $Conf{CgiAdminUsers}. If you don't want 
 BackupPC::Lib-new failed
[FAILED]
 
 it has happened on an upgraded FC4, and a fresh install on RHEL3.

Hmmm.  Not good.  I tried this on my setup (pasting in the old
EOF-delimited EMailHeaders) and it is fine.

Can you email me (offlist) the complete config.pl and tell me which
parameter you changed to create the problem?

BTW, the old config is saved to config.old.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] Doing backups every 2 hours

2006-08-17 Thread Craig Barratt
Vinicius writes:

 I want to do backups every 2 hours, on a single host, using
 backuppc.In order to do this, without sucess, I included on the PC's
 config.pl these settings:
 
 $Conf{BlackoutGoodCnt} = -1; #in order to not use blackout
 $Conf{WakeupSchedule} = [10,12,14,16,18,20,22]; #to make it wakeup and
 do it's job
 $Conf{IncrPeriod} = 0.08; #to do an incremental in every two hours.
 $Conf{FullPeriod} = 0.97; # to do a full backup daily
 $Conf{FullKeepCnt} = 2; #to keep two full daily backups
 $Conf{IncrKeepCnt} = 8; #to keep some hourly backups from yesterday
 
 When the wakeup hour comes, the backup is skipped without a single error.
 
 Can someone share some ideas on what I am doing wrong? I already looked
 in the list archive and on the internet.

Run:

su backuppc
BackupPC_dump -v -i HOST 

and look at the output.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] Interupted backups, partial files and Rsync

2006-08-18 Thread Craig Barratt
David writes:

 Les Mikesell wrote:
  On Thu, 2006-08-17 at 03:23, David Simpson wrote:



  $Conf{ClientTimeout} = 3600*6;  # 6 Hours!!
  
 
  Can you crank it much higher and try to get your first run
  to complete over a weekend?  If you are using rsync over ssh
  and the data can be compressed, it might also help to add the
  -C option  after $sshPath in your $Conf{RsyncClientCmd}.

 Yes, I can crank it up, but it doesn't really solve the problem. This is 
 not the first run of the backup, it has been working well both locally 
 and remotely for several months. It's just that recently someone placed a 
 large new file in one of the directories which is backed up, and that 
 caused it to fail.

You could try replacing these lines of code in lib/BackupPC/Xfer/RsyncFileIO.pm:

    if ( defined($fio->{rxFile}) ) {
        unlink($fio->{rxOutFile});
        $fio->log("finish: removing in-process file $fio->{rxFile}{name}");
    }

with:

    if ( defined($fio->{rxFile}) ) {
        $fio->{rxFile}{size} = $fio->{rxSize};
        $fio->attribSet($fio->{rxFile});
        $fio->log("finish: keeping in-process file $fio->{rxFile}{name}"
                . " size $fio->{rxSize}");
    }

I'm not sure this will work correctly, but it is worth a try.
Of course, this will probably void your BackupPC warranty.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] multi level incrementals

2006-08-19 Thread Craig Barratt
Toby Johnson writes:

 I'm a bit confused with the description of multi-level incrementals in 
 3.0.0b1. Does this mean in plain English that incrementals will only 
 backup files that have changed since the previous incremental instead of 
 since the previous full backup?

Yes.

 Do these incrementals still appear filled when browsing on the 
 website? And I still get the benefits of pooling? In other words, are 
 they exactly like other backups in all respects aside from the fact that 
 they transfer less data?

Yes.  Incrementals will still be filled or merged (now with possibly
multiple other incrementals).  Pooling still works.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] backup recovery

2006-08-20 Thread Craig Barratt
David Koski writes:

 I made some changes to config.pl to reduce the number of backups:
 
 $Conf{FullKeepCnt} = [2,0,0,2,0,0,2];
 
 ..whereas before I had [2,2,2,2,2,2,2];
 
 Now I need to recover files from a year ago and they are actually still there 
 but do not show up in the GUI. The GUI only goes back to 7/29 but for some 
 reason the files still exist on the backuppc server. When I try to execute:
 
 /usr/share/backuppc/bin/BackupPC_tarCreate -t -n 126 -h 
 tiikeri /home/users/dkoski/src
 
 (all on one line) I get the error bad backup number 126 for host tiikeri. 
 How do I restore the files?

Sorry about the delay in replying.

What version of BackupPC are you running?  There was a bug
related to this fixed in 2.1.2.

If you look in the per-client log files do you see messages about
removing of the backups?  Does it match what is currently in the
backups file?

In 3.0.0 there is a script to recreate a missing or damaged
backups file.  It relies on some additional information saved
in 3.0.0.  However, it also has a mode (using the -l option)
that allows it to do the best it can to recover pre-3.0.0
backups too.

So, if you are running something prior to 2.1.2 then this
is a known bug (fixed since 2.1.2), and I recommend upgrading
to 3.0.0 and running:

su backuppc
BackupPC_fixupBackupSummary -l HOST

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] user backuppc has no home directory?

2006-08-21 Thread Craig Barratt
Rodrigo Real writes:

 Yes, but your home path has some strange spaces, it should be
 something like this:
 
 backuppc:x:101:407:added by portage for backuppc:/var/lib/backuppc:/usr/bin/sh
 
 sh usually is in /bin, but I am not sure about Gentoo. If you still
 have trouble on this, check your sh path.

You can also keep backuppc with no shell, and use the -s option
to su:

su -s /bin/bash backuppc

Depending upon your configuration you might also need the -l option.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] Renaming PCs that have been backed up

2006-08-24 Thread Craig Barratt
Les Stott writes:

 David Wimsey wrote:

  Let me start off with sorry if this has been answered and I didn't find 
  it in the archives, I'm betting someone else has already asked, but here 
  it is anyway.
 
  I had an instance of BackupPC running for nearly a year and had to stop 
  for various silly reasons.  Since I stopped, all the end user machine 
  there were being backed up have changed names.  There were something 
  like _username and are now just username.  What I'm wondering is if 
  I can update the configs, and rename the various host directories and 
  have BackupPC continue to work with the machines without any big hassles?

 yes definitely. Rename the hosts in the hosts file, then rename the top 
 level pc directories. Make sure permissions are still set properly.
 
 Try doing this for one of your hosts, then reload or restart the 
 backuppc configuration and you should see the change. its worked for me.

Yes, renaming things in this way is the best approach.

An alternative is to keep everything the way it is and use
$Conf{ClientNameAlias} in each per-PC config file to point
at the new host names.  But then you forever have the legacy
names in BackupPC.
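
As a sketch of that alias approach, the per-PC config for the legacy
host name would contain something like the following (the host names
here are the illustrative ones from the question above):

```perl
# pc/_username/config.pl -- keep the legacy BackupPC host name "_username"
# but actually connect to the renamed client.  "username" is a
# hypothetical new host name; substitute your own.
$Conf{ClientNameAlias} = 'username';
```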

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] why smb errors when $Conf{XferMethod} = 'rsync' ?

2006-08-30 Thread Craig Barratt
naroza writes:

 I'm a little confused. For those that don't know me, I'm running a Gentoo
 installation, trying to get backuppc v2.1.2 running so that I might back up some
 Linux clients via rsync.
 
 I have the following in /etc/backuppc/:
 
 -rwxr-xr-x  1 root root 64379 Aug 23 02:35 config.pl
 -rw-r--r--  1 root root  2264 Aug 16 15:36 hosts
 
 
 I have edited a certain line in /etc/backuppc/config.pl:
 
 $Conf{XferMethod} = 'rsync';
 
 
 Why, then, when I try to start backuppc, does backuppc start looking for an 
 smb
 client program?:
 
 # /etc/init.d/backuppc start
  * Starting BackupPC ...
 2006-08-30 00:31:30 $Conf{SmbClientPath} = '/usr/bin/smbclient' is not a valid
 executable program  [ !! ]
 
 
 ...weird, right? Anybody know how to stop this from happening? Am I 
 overlooking
 some other configuration option?

When BackupPC starts it checks that all the important external programs
in the config file exist and are executable.

When configure.pl runs it should set them up correctly or leave them
empty.  It appears the Gentoo installation doesn't do those checks.

You should simply set $Conf{SmbClientPath} to undef, or an empty
string, or as Les suggested some other executable program (but
don't forget to change it back if you ever use smb).
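
A minimal sketch of that workaround, for a config.pl on a host that
never uses smb:

```perl
# Since this installation only uses rsync, silence the startup check by
# clearing the smbclient path.  Restore the real path (e.g.
# '/usr/bin/smbclient') if you ever switch a host to the smb XferMethod.
$Conf{SmbClientPath} = undef;
```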

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] rsync restore fatal error

2006-08-30 Thread Craig Barratt
R G writes:

 I am further testing my Backuppc setup. I just attempted to do a restore of
 I temporary directory that I created the night before in the /etc directory.
 I deleted the directory and attempted to do a restore. I get the following
 error in the log:
 
 
 Running: /usr/bin/sudo rsyncPath --server --numeric-ids --perms
 --owner --group --devices --links --times --block-size=2048 --relative
 --ignore-times --recursive --checksum-seed=32761 . /etc/gallery2/
 Xfer PIDs are now 12412
 Got remote protocol 1936941392
 Fatal error (bad version): Password:
 ^

Instead of getting the protocol version, it got the
text Password: instead.

So the problem is that sudo is prompting for a password.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] FTP Save ?

2006-09-01 Thread Craig Barratt
Nils writes:

 Trasher wrote:
 
  I use BackupPC for my local network backup and all is running very  
  fine :)
  I'd like now to backup a website, which I can only acceed by ftp  
  protocol.
 
  A simple wget command does actually the trick, but I'd prefer using  
  backuppc capabilities. Is it possible to use wget as backup command  
  in this case ?
 
 BackupPC only supports smb, tar and rsync transfer methods. You could  
 use a cronjob on of your machines that is already being backed up to  
 backup your site using wget and that way get that backup into BackupPC.

Or you could use $Conf{DumpPreUserCmd} to run wget and XferMethod
tar or something similar to pick up the tree created by wget.  Note
that there is no difference between incremental or full backups in
this case since every file will be re-written by wget.
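
A rough sketch of that wget-plus-tar idea; the URL, mirror directory,
and wget flags below are placeholders for illustration, not a tested
recipe:

```perl
# Hypothetical per-PC config: mirror the site before each dump, then let
# the tar XferMethod pick up the downloaded tree.
$Conf{DumpPreUserCmd} = '/usr/bin/wget -q --mirror -P /var/tmp/sitemirror ftp://example.com/';
$Conf{XferMethod}     = 'tar';
$Conf{TarShareName}   = ['/var/tmp/sitemirror'];
```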

Long term I was planning to add wget as an XferMethod to
support ftp and http, but it is pretty far down the list.
This would allow things like the admin interfaces of
network HW to be backed up, so if the box dies you can
at least browse the most recent configuration settings.
(No notion of restore of course.)  Plus you could see when
pages change between backups, either manually using the
history feature or as part of the tripwire feature on the
proposed feature list.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] Restore of symlinks

2006-09-01 Thread Craig Barratt
Sturla writes:

 I restored to the same directory as the backup was taken from and the
 symlinks LOOK ok when I stat them, but they just won't work.
 If I delete the symlink and create it again with the info from stat it
 works, but it's kinda tedious doing this on every symlink.
 I tried to write a script to do it for me, but I'll have to work some more
 on the selection-part on what to do with regards to what the first char in
 the symlink is (wether its / or .. or just a regular letter).
 
 Any pointers would be great...

You have to include some specific examples in your email.
Show us a symlink that doesn't work, and one that does.
Are just absolute symlinks or just relative symlinks
broken?

Do ls -l LINKNAME | od -c to see if there are any extraneous
characters in the symlink.  Also do ls -lL LINKNAME to see
if the target file exists.

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] BackupPC refuses to find File::RsyncP

2006-09-07 Thread Craig Barratt
Johan Ström writes:

 I just got BackupPC setup and working, (what a nice piece of  
 software :D), when I had a disk crash on the system drive of the  
 system running backuppc... The disk where backuppc has its top-dir is  
 however not the same and everything there is fine.
 
 After a fresh reinstall of FreeBSD 6.1 on a new disk, I tried to get  
 BackupPC working again. I installed it using the old config.pl (when  
 asked in configure.pl), and I had all perl plugins installed..
 However.. As soon as any of the clients are to be dumped, all I get  
 is this in the log:
 
 File::RsyncP module doesn't exist

Strange.  Your tests do indeed confirm that File::RsyncP does
exist and perl can load it.

The test is in lib/BackupPC/Xfer/Rsync.pm.  All it does is check
whether "use File::RsyncP;" works or not.  It shouldn't matter
whether the version is 0.52 or 0.64.

Is it possible the BackupPC_dump is running a different (older)
version of perl?  Look at the first line of BackupPC_dump, eg:

/usr/bin/perl

and check whether that version can load File::RsyncP:

/usr/bin/perl -e 'use File::RsyncP;'

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] tarExtract: Integer overflow in octal number

2006-09-14 Thread Craig Barratt
Jean-Michel Beuken writes:

 is it serious ?
 
 Running: /usr/local/samba/bin/smbclient \\\\192.168.23.11\\PC-Boumal 
 -U boumal -E -N -d 1 -c tarmode\ full -TcrX - ntuser.dat.LOG \\Local\ 
 Settings\\Temp\\\* \\Local\ Settings\\Temporary\ Internet\ Files\\\* 
 full backup started for share PC-Boumal
 Xfer PIDs are now 27714,27713
 tarExtract: Integer overflow in octal number at 
 /usr/local/BackupPC3/bin/BackupPC_tarExtract line 225.

That line of code is:

$mtime= oct $mtime;

so there is a file that has an mtime that is not encoded
correctly in octal by smbclient (at least according to perl)
when it creates the tar file.

The error isn't fatal, but it does mean one of your 35145
files has the wrong mtime.

I'd be curious to figure out where this bogus mtime comes
from.  Perhaps on your solaris 10 system mtime is 64 bits,
and the encoding of a bogus value from smb fails in some
way.  What version of smbclient is this and what is the
target file system?

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] Authentication failing for Incremental backups (3.0b1)

2006-09-29 Thread Craig Barratt
Kevin,

Other users have noted that authentication for incrementals doesn't work
in smbclient version 3.0.23.

I don't know if there is a fix or an understanding of the root cause.
Down-reving smbclient is one solution.

Craig

-
Take Surveys. Earn Cash. Influence the Future of IT
Join SourceForge.net's Techsay panel and you'll get the chance to share your
opinions on IT  business topics through brief surveys -- and earn cash
http://www.techsay.com/default.php?page=join.phpp=sourceforgeCID=DEVDEV
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] how to rebuild pc/hostname/backups file

2006-10-01 Thread Craig Barratt
Stian Jordet writes:

 Tue, 2005-09-13 at 22:38 -0700, Craig Barratt wrote:
  Carl Wilhelm Soderstrom writes:
  
   I'm running out of disk space on my backup server, and it's run out of 
   space
   on a couple of occasions. when it does this, some hosts 'forget' all 
   their old
   backups -- those backups no longer appear in the pc/hostname/backups 
   file.
   
   I know the backups.old file has a copy of the last known-good backups 
   file;
   but what happens if that one is bad as well? (Machine runs out of space,
   space is freed, a running backup is shut down, a bad backup file is 
   written
   and clobbers the old backups.old file).
   
   I think this has been discussed on the list in the past; but can someone
   remind me how to rebuild a backups file based on the actual directories
   present?
   
   This is something that probably should be added to the docs.
  
  I need to write a script that does this for you.  It could
  also be done manually.
  
  Some information (eg: number of files backed up, total size)
  takes more work to recreate since it requires the backed-up
  tree to be traversed.  But that information is only used for
  stats/display, so it is not critical.
  
  Some information (eg: full vs incr) probably needs to extracted
  from the per-PC LOG file.
  
  Some information (eg: existing count and size of files in pool) is
  not easy to reconstruct, but again it only used for stats/display.
  
  Anyhow, this is something I should work on.  Do you need a solution
  quickly?
 
 I just blew a fuse trying to start up too many computers at once, and
 the battery on my UPS obviously isn't what it used to be anymore, so my
 server died. It might seem like it was doing some backup, because the
 backups file for three of my machines were empty. backups.old as well. 
 
 So my question is, was this script ever written? I see that this is over
 a year ago now. If not, how can I recover most easily?
 
 To make things more complicated, a backup started of one computer before
 I found out that the backup files had been truncated. I had one backup 0
 from sometime last summer that now was overwritten. It doesn't really
 matter that the old backup is lost, except that I have about 25 backups
 for that machine, and 0 is the newest one. Should I just delete the 0
 directory before I recreate the backups file? (When I know how to :P)
 
 And sorry for bringing this old thread back to life.

In BackupPC 3.0.0 an additional copy of the per-backup information is
stored in each backup directory, which readily allows the backups file
to be re-created.  Also, the writing of these files is now verified
by writing to a temp file, re-reading and checking the contents,
before replacing the original.
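
The write-then-verify pattern described above can be sketched
generically (this is an illustrative sketch in Python, not BackupPC's
actual perl code):

```python
import os
import tempfile

def safe_write(path: str, data: bytes) -> None:
    # Write to a temp file in the same directory, re-read and compare,
    # then atomically replace the original -- the same idea BackupPC
    # 3.0.0 applies to its per-PC backups files.
    d = os.path.dirname(os.path.abspath(path)) or "."
    fd, tmp = tempfile.mkstemp(dir=d)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        with open(tmp, "rb") as f:
            if f.read() != data:
                raise IOError("verification failed, original left intact")
        os.replace(tmp, path)  # atomic on POSIX
    except Exception:
        if os.path.exists(tmp):
            os.remove(tmp)
        raise
```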

BackupPC 3.0.0 includes a recovery script BackupPC_fixupBackupSummary
that re-creates the backups file.  It includes the -l option for
legacy mode that looks through the log files to recreate backup
files from 2.x.

Unfortunately BackupPC_fixupBackupSummary won't run without changes
on BackupPC 2.x since it depends on some new modules in BackupPC 3.0.0
(BackupPC::Storage).

You should be able (but I haven't confirmed) to install BackupPC 3.0.0
in a new directory (pointing it at the old data store) and just run

BackupPC_fixupBackupSummary -l HOST

Make sure you keep a copy of the main config.pl file so you
can revert back to it when you restart BackupPC 2.x.

Or you could actually upgrade to BackupPC 3.0.0, although I don't
recommend that right now since it's best not to change too many
things when you are trying to recover from a problem.

Craig

-
Take Surveys. Earn Cash. Influence the Future of IT
Join SourceForge.net's Techsay panel and you'll get the chance to share your
opinions on IT  business topics through brief surveys -- and earn cash
http://www.techsay.com/default.php?page=join.phpp=sourceforgeCID=DEVDEV
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] how to retrieve last backup for a host (from command line)?

2006-10-11 Thread Craig Barratt
Tomasz writes:

 I have X hosts, and would like to write a script to retrieve last 
 backups from these hosts, and save it as host1.tar, host2.tar etc. - to 
 later write it to a tape.
 
 What is the best method to do that?
 
 Should I just tar the current BackupPC archives like below?
 
 tar -cf host1-last.tar /backuppc-data/pc/hosts/host1/last_backup
 tar -cf host1-last.tar /backuppc-data/pc/hosts/host2/last_backup
 
 Or is there a better method?
 
 The above seems to have an advantage that I don't have to repack 
 everything (i.e., uncompress from BackupPC format, than make a tar 
 package out of it) - but I'm not sure if I can restore such a backup 
 reliably later on.

You could use the archive feature, or run BackupPC_tarCreate.

Running tar directly means that the file names and attributes
won't be correct, the contents will be compressed (if enabled),
and if you run it on an incremental you won't get all the files.
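
As a hedged sketch (host, share, and install path are hypothetical;
check the BackupPC_tarCreate usage message for your version), dumping
the most recent backup of a host to a tar file could look like:

```shell
# Dump the latest backup (-n -1) of host1's /home share to a tar file.
# Run as the backuppc user so the pool files are readable.
/usr/local/BackupPC/bin/BackupPC_tarCreate -h host1 -n -1 -s /home . > host1-last.tar
```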

Craig

-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/


Re: [BackupPC-users] how to rebuild pc/hostname/backups file

2006-10-11 Thread Craig Barratt
Stian Jordet writes:

  Umm, sorry to nag you with this again, but I haven't taken a backup for
  almost two weeks, trying to get this fixed first. Was this the only way,
  and what am I doing wrong? What info does it need that it doesn't find?
 
 I think I found why it wasn't working. You (seem) to have a typo in line
 171 of BackupPC_fixupBackupSummary. It should have been $3, not $2. 
 
 And in line 157 I think you should have $1, not $str.

You're right on both counts.  And the code to write the file out
wasn't there.  I somehow ended up with an incomplete version in
the 3.0.0beta release.

 And third, somehow I need to have the parsedate statements outside the
 array to make them work. Have no idea why (I know absolutely nothing
 about perl whatsoever!).

I don't see that issue.

 And so on. While this may seem right (for what I know, that is), I
 wonder why they are sorted wrong, and fillFromNum on fulls are totally
 weird, but that doesn't matter, probably?

fillFromNum matches the num (backup number), not the index.
I'm not sure why the sorting isn't working - they should be
in num order.

 But what I _really_ wonder about, is how to make this a pc/$host/backups
 file? :)

I added the write function $bpc-BackupInfoWrite($host, @Backups).

I'm attaching a corrected version.  To run this you will need
to change the 'use lib /usr/local/BackupPC/lib;' at line 41
to the correct path for your installation.

Craig

BackupPC_fixupBackupSummary
Description: Binary data


Re: [BackupPC-users] disk too full

2006-10-15 Thread Craig Barratt
Paul Fox writes:

 my backup pool disk is 96% full, and backuppc has stopped doing
 backups.  i have no problem with that.
 
 i don't look at the PC status page every day, and only found out
 that things were amiss when i got one of the your machine hasn't
 been backed up for a week messages.  it turns out that none of
 my machines have been backed up in that time.
 
 what bothers me is that it never told me.  there's no notice on
 the status page that the disk is too full, and no mail was sent
 when the disk filled up.  i'd think a message to the effect of
 your PC or laptop was not backed up because there is not enough
 space left on the backup server would be entirely appropriate.

Looks like you haven't set up the admin email address, $Conf{EMailAdminUserName}.

If any hosts are skipped because the disk is too full, the admin
address will get a nightly email.
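
In the main config.pl that is a one-line setting (the user name here
is only an example):

```perl
# Recipient of the nightly admin email; skipped-host warnings go here.
# Can be a local user name or a full email address.
$Conf{EMailAdminUserName} = 'backuppc';
```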

Craig



Re: [BackupPC-users] Cannot queue restores

2006-10-29 Thread Craig Barratt
Jacob writes:

 I am running BackupPC 2.1.2pl2. I have a problem that I cannot queue
 multiple restores. I request a direct restore of a users files, and that
 job qoes into the queue. I then request another restore while the first
 one is running. I get a message:
 
 but a job is currently running, so this request will start later.
 
 I then look in the 'current queues' areas and it is empty - suggesting
 there are no other jobs in the queue. But I know there are because I just
 requested one.  I look in the pc's folder, and see various
 restoreReq.xx files. Every time I request another queued restore, a
 restoreReq file appears with the correct restore info.
 
 But the queue is empty, and no scheduled restores complete ever, only
 those I request when there are none others running.
 
 The restoreRequests just stay in the folder and nothing happens.
 
 How can I get the restores to be correctly scheduled ?  I have 40
 different files to restore after a disk failure.

Sorry about the delay in replying.  This bug is fixed in BackupPC 3.0.0beta.

If you need a fix for 2.1.2 I can tell you what to change.

Craig



Re: [BackupPC-users] how to check the size of directory to restore?

2006-11-01 Thread Craig Barratt
Tomasz writes:

 I want to restore the profile of a user.
 
 I would like to know, how big it is.
 
 How can I do it?
 
 Basically, I would like to check the directory size in BackupPC web 
 interface - right now it only says the size is 4096, which is perhaps 
 true from technical point of view, but not from user's point of view.
 
 I know I can du du -sh /backuppc/pc/.../directory - but that will 
 calculate only the compressed size.

Yes, and it won't include many files if you are using an incremental.

Traversing the directory tree to estimate the size of the tree can
be expensive, so no, there isn't a way to do it in the CGI interface.

A moderately accurate way is to run BackupPC_tarCreate and pipe the
output into wc.  This isn't exact because of the tar headers and
padding of each file to match the tar block size.
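
For example (host, share, and install path are hypothetical; the tar
headers and 512-byte block padding make the count a slight
overestimate):

```shell
# Rough uncompressed size, in bytes, of the latest backup of host1's /home share.
# BackupPC_tarCreate writes a tar stream to stdout; wc -c counts its bytes.
/usr/local/BackupPC/bin/BackupPC_tarCreate -h host1 -n -1 -s /home . | wc -c
```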

Craig



Re: [BackupPC-users] archive command line usage problem.

2006-11-01 Thread Craig Barratt
Shawn writes:

 I hope this is something simple -- I can't get my archive to run via cron
 (as user backuppc).  Nor will it run manually in a terminal.
 
 My command is:
 /usr/share/backuppc/bin/BackupPC_archiveHost
 /usr/share/backuppc/bin/BackupPC_tarCreate /usr/bin/split /usr/bin/par2
 emailpool -1 /bin/gzip .gz 7000 /var/lib/backuppc/backuparchive 5 *
 
 and the output is always:
 Usage: /usr/share/backuppc/bin/BackupPC_archiveHost tarCreatePath splitPath
 parPath host bkupNum \
   compPath fileExt splitSize outLoc parFile share
 
 I've checked the paths and spaces and everything appears correct.  Running
 the archive via the web interface works fine.  Copy/pasting the command used
 by the web interface from the log files onto a terminal line produces the
 same result - just the command summary.  sudo -u backuppc does the same
 thing as does running it as a user or root.  The operating system is ubuntu
 6.06.
 
 -1 means last complete backup correct?  (I've changed this number anyways
 and it doesn't seem to matter.)  split, gzip, par2 are all in the folders as
 indicated and appear to be working perfectly as the web interface works
 great.

If you run the command manually you need to quote the * at the end, eg:

/usr/share/backuppc/bin/BackupPC_archiveHost\
/usr/share/backuppc/bin/BackupPC_tarCreate /usr/bin/split /usr/bin/par2 \
emailpool -1 /bin/gzip .gz 7000 /var/lib/backuppc/backuparchive 5 *

Craig



Re: [BackupPC-users] mothballing a host's previous contents

2006-11-01 Thread Craig Barratt
Paul writes:

 i'm pretty confident i'm okay on this, but just to be sure:
 
 i have a host that i've reinstalled with a new OS.  i want to
 keep its former backups around for quite a while, in case i need
 something.  the hostname remains the same (stump).  in the
 backuppc configs, i've done this:
 
 mv pc/stump pc/oldstump
 mkdir pc/stump
 cp pc/oldstump/config.pl pc/stump
 chown -R backuppc:backuppc pc/stump
 
 and then i edited conf/hosts to add a line for oldstump.
 
 is this sufficient?

Yes, that should work.

 there are other places where a machine's hostname is stored which
 might cause the two trees' contents to be confused somehow later on?

There's one other place, but it doesn't matter.  BackupPC maintains
status per host, so the new stump will still have the old machine's
last backup status.  Since it's just used for display in the host
summary and host detail it doesn't matter.  Within a day or so the
new host will have the correct status.  If you really want to fix this
you will need to stop BackupPC, edit log/status.pl, and then restart it.
But it's not worth the trouble.

Craig



Re: [BackupPC-users] Full Periods mean

2006-11-02 Thread Craig Barratt
Ariyanto writes:

 I have implement backuppc to backup my servers and
 doing great. But I have a
 silly question in mind, what is the meaning
 $Conf{FullPeriod} = 6.97? Is it
 mean a whole week? Can I just put it as 6 without .97?

The goal here is to keep the schedule at the same time of day.
With an hourly WakeupSchedule, 6.97 is a full week minus about 43
minutes, which will tend to keep the backup at the same hour each
week.

If you use 7 (exactly a week) it is likely that the backup will slip
an hour.  For example, if the WakeupSchedule includes 1pm, and the
last full backup started at 1pm plus a few seconds, then this week
at 1pm, if BackupPC_dump starts a few seconds earlier, the
difference is just less than 7.0, so the full won't happen that
hour.

 What if I want to backup the server monthly? Can I
 just put it 30 as
 fullperiod?

You should do 29.97.  That will be every 30 days, not exactly
monthly but close enough.

Craig



Re: [BackupPC-users] can I define per-PC directories to backup?

2006-11-02 Thread Craig Barratt
Mikael writes:

 I just installed BackupPC 2.1.2 and I can't find how to define per-PC
 based directories to backup. At the moment it just backups the same
 directories from all hosts.

 The web page mentions that this should be possible. Is it only in the
 version 3 beta?

You can do this in 2.x and 3.x.  You need to create a config.pl file
in a particular host's directory (eg: /data/BackupPC/pc/HOST/config.pl).
You only need to put the settings that differ from (override) the
master config.pl in that file.
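
A minimal per-host override file might look like this (the transfer
method and directory list are hypothetical examples, not defaults):

```perl
# /data/BackupPC/pc/HOST/config.pl -- only the overrides go here;
# every other setting falls through to the master config.pl.
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncShareName} = ['/etc', '/home'];   # directories to back up on this host
```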

Craig



Re: [BackupPC-users] rsync and smbclient problems

2006-11-05 Thread Craig Barratt
Stephen Hemminger writes:

 I am running backuppc server on Ubuntu Dapper package (v2.1.2)
 
 My linux (rsync) clients fail without getting filelist.
 
 XferLog:
 Contents of file /var/lib/backuppc/pc/deepthought/XferLOG.bad, modified 
 2006-11-04 23:12:01
 
 Running: /usr/bin/ssh -q -x -l root deepthought /usr/bin/rsync --server 
 --sender --numeric-ids --perms --owner --group --devices --links --times 
 --block-size=2048 --recursive --checksum-seed=32761 --exclude=/tmp 
 --ignore-times . /home/

Replace --devices with -D.
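
In config.pl terms, that means editing $Conf{RsyncArgs}.  A sketch,
based on the argument list quoted in the log above (not a complete or
authoritative default list):

```perl
# Drop '--devices' (not understood by some client rsyncs) in favor of '-D'.
$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group', '-D',
    '--links', '--times', '--block-size=2048', '--recursive',
];
```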

Craig



Re: [BackupPC-users] Upgrade to 3.0.0 beta

2006-11-07 Thread Craig Barratt
Riaan writes:

 Was wondering the same thing myself, also when will version 3 go out of
 beta/when will a final version of 3 go out ?

I will do one more 3.0.0 beta by the end of this month.
That should be very close to the final 3.0.0 release.

Even though the 3.0.0 beta releases are quite stable, given the
wide deployment of BackupPC I wanted to have a conservative beta
cycle.

Craig



Re: [BackupPC-users] broken backups - empty folders with only first letter of its name

2006-11-07 Thread Craig Barratt
Tomasz writes:

  It looks exactly the same in the web interface - one letter folder 
  names, empty inside.
  
 
 I started the backup manually, and it produces the following error:
 
 
 /srv/backuppc/bin/BackupPC_dump -v -f windows_server
 
 (...)
 98208 ( 3688.7 kb/s) \Freigaben\Db\Gl\STICHW.DBF
105472 ( 2512.2 kb/s) \Freigaben\Db\Gl\STICHW.MDX
   2239163 ( 4662.4 kb/s) \Freigaben\Db\Gl\STOFFE.DBF
851968 ( 6073.0 kb/s) \Freigaben\Db\Gl\STOFFE.MDX
   403 (  393.6 kb/s) \Freigaben\Db\Gl\STOFFTYP.DBF
  4096 (  200.0 kb/s) \Freigaben\Db\Gl\STOFFTYP.MDX
 write_socket_data: write failure. Error = Connection reset by peer
 write_socket: Error writing 59 bytes to socket 3: ERRNO = Connection 
 reset by peer
 Error writing 59 bytes to client. -1 (Connection reset by peer)
 Error reading file \Freigaben\Db\Gl\Texte.mem : Write error: Connection 
 reset by peer
 Didn't get entire file. size=21374948, nread=6420960
  21374948 ( 3090.2 kb/s) \Freigaben\Db\Gl\Texte.mem
 Write error: Connection reset by peer opening remote file 
 \Freigaben\Db\Gl\Z (\Freigaben\Db\Gl\)
 Write error: Connection reset by peer opening remote file 
 \Freigaben\Db\Gl\Z (\Freigaben\Db\Gl\)
 Write error: Connection reset by peer opening remote file 
 \Freigaben\Db\G (\Freigaben\Db\)
 Write error: Connection reset by peer opening remote file 
 \Freigaben\Db\g (\Freigaben\Db\)
 Write error: Connection reset by peer opening remote file 
 \Freigaben\Db\g (\Freigaben\Db\)
 
 (...)
 
  directory \Freigaben\a\
 Write error: Connection reset by peer listing \Freigaben\a\*
  directory \Freigaben\a\
 Write error: Connection reset by peer listing \Freigaben\a\*
  directory \Freigaben\b\
 Write error: Connection reset by peer listing \Freigaben\b\*
  directory \Freigaben\b\
 Write error: Connection reset by peer listing \Freigaben\b\*
  directory \Freigaben\b\
 Write error: Connection reset by peer listing \Freigaben\b\*
  directory \Freigaben\b\
 Write error: Connection reset by peer listing \Freigaben\b\*
  directory \Freigaben\d\
 Write error: Connection reset by peer listing \Freigaben\d\*
  directory \Freigaben\d\
 Write error: Connection reset by peer listing \Freigaben\d\*
  directory \Freigaben\f\
 Write error: Connection reset by peer listing \Freigaben\f\*
  directory \Freigaben\f\
 Write error: Connection reset by peer listing \Freigaben\f\*
  directory \Freigaben\f\
 Write error: Connection reset by peer listing \Freigaben\f\*
  directory \Freigaben\g\
 Write error: Connection reset by peer listing \Freigaben\g\*
  directory \Freigaben\g\
 Write error: Connection reset by peer listing \Freigaben\g\*
 
 (...)
 
  directory \S\
 Write error: Connection reset by peer listing \S\*
  directory \W\
 Write error: Connection reset by peer listing \W\*
 tar: dumped 33357 files and directories
 Total bytes written: 12977635840

Hmmm.  Smbclient is getting quite messed up.

I'd recommend running chkdsk on the Windows machine.  There's a
chance the problem is caused by underlying corruption in the
client machine's file system.

Unfortunately smbclient doesn't provide a consistent set of
error messages, so it's hard for BackupPC to tell whether things
are ok or not.  I can add a couple of the above messages so they
will be considered fatal errors.

Craig



Re: [BackupPC-users] SMB backup failing

2006-11-08 Thread Craig Barratt
Jason writes:

 Hi all.  Nobody has responded to my other messages requesting help, so 
 I'm trying again.  I'm using the 2.1.2 version.
 
 I have one Windows machine that is backing up flawlessly (other than 
 NT_SHARING_VIOLATIONs that are unavoidable).  I have another that is 
 failing when it gets to a very large file.  Because it failed before, I 
 increased the ClientTimeout to 72 (10x higher than default).  I 
 deleted everything about the host that was failing and started again. 
 
 Contents of file /var/backuproot/pc/mothership/LOG, modified 2006-11-06 
 18:52:34
 
 2006-11-05 02:46:59 full backup started for share CDrive
 2006-11-06 18:52:28 Got fatal error during xfer (Unexpected end of tar 
 archive)
 2006-11-06 18:52:33 Backup aborted (Unexpected end of tar archive)
 2006-11-06 18:52:34 Saved partial dump 0
 
 This took 40 hours to run, and backed up a lot, but when it got to a 12gb 
 file, it choked.
 
 Here's the XferLog Errors:
 
 Error reading file \video\2005\video2005raw.avi : Call timed out: server did 
 not respond after 2 milliseconds
 Didn't get entire file. size=12776697964, nread=-1340081072

The first problem is the timeout.  That could be due to anti-virus SW
that completely checks a file (or directory) before you can open the
file.  You can increase the timeout but you need to recompile smbclient.

The second problem is a 32-bit rollover bug in smbclient that only
happens after you get the Didn't get entire file error, when smbclient
tries to pad the tar file.  A bug report and fix are filed in Samba's
bugzilla, but I doubt the fix has been applied yet.

Craig



Re: [BackupPC-users] Upgrade to 3.0.0 beta

2006-11-08 Thread Craig Barratt
David writes:

  On 11/7/06, Craig Barratt [EMAIL PROTECTED] wrote:
   I will do one more 3.0.0 beta by the end of this month.
   That should be very close to the final 3.0.0 release.
  
   Even though the 3.0.0 beta releases are quite stable, given the
   wide deployment of BackupPC I wanted to have a conservative beta
   cycle.
  
  I personally would like to see more frequent beta releases that
  quickly address any known issues.
  
  Well, are there any known issues with the current beta? I would like
  to start testing 3.0, but IIRC there was some issues with the first
  beta so I was waiting for the 2nd one...
  
  -Dave
 
 I've been using 3.0.0 for a week or so and it seems to work fine.  That
 being said, there are 2 caveats I think should be mentioned.
 
 First, the default location for config files has changed.  The main
 config file is in /etc/BackupPC and the individual host configs are
 in /etc/BackupPC.  I assumed that the new version would migrate
 from the old without incident.  Obviously I should have read the docs
 more closely!

If you upgrade to 3.0.0 (by telling configure.pl the path to the
current 2.x config.pl file) then all the config files will be used
in place.  The new location (below /etc/BackupPC) is only used on
a new install.

 Second, which relates somewhat to the item above, is that the default
 full backup count is 1 (one).  If, as happened to me, the install is
 done and doesn't find the old config files, some old backups may be
 deleted.

It sounds like you did a new install, rather than an upgrade.

 Craig:  might it not be a good idea to set the default backup count to
 a big number to avoid this problem?

I don't think that is necessary.

Craig



Re: [BackupPC-users] Windows volume/shadow copies

2006-11-08 Thread Craig Barratt
Stephen Joyce writes:

 Is anyone doing windows backups (including open files) using the volume 
 shadow copy service?

 It seems that it shouldn't be too hard to even do bare-metal restores by 
 combining VSS with fileacl or setacl to record  restore NTFS ACLs on 
 files and dirs.
 
 Does anyone have a recipe for doing this, before I re-invent the wheel?

Yes, VSS looks like the most promising way to backup all WinXX files.

I'm attaching a few relevant emails from the mail list.

Craig

-- Forwarded message --
To:   backuppc-users@lists.sourceforge.net
From: Elias Penttilä [EMAIL PROTECTED]
Date: Mon, 10 Apr 2006 14:52:05 +0300
Subj: [BackupPC-users] rsync with VSS-support

Hi,

this is something I did a couple of days ago for our company's use, I 
thought it would maybe help someone else too.

Basically this enables rsync to send files which are locked, e.g. in-use 
Outlook .pst-files, the Windows registry and pretty much everything else 
except EFS-encrypted files.

Everything can be found here: http://iki.fi/~eliasp/rsync-vss

The most useful thing is probably the rsync.msi-file which will install 
rsync under C:\Program Files\rsync. A zip-file is also provided which 
includes everything in the installer plus the installer as a WiX-source 
file.

NOTE that this only works on Windows XP currently, I tried it on Server 
2003, but for some reason it didn't work.

There is a patch file which can be applied to the rsync 2.6.6 source 
found in cygwin. It's pretty simple stuff actually. Note that it needs a 
C++-compiler as I didn't have time to fiddle with the C-style COM. There 
were some other issues too, like the linker not finding 
CreateVssBackupComponents directly and fstat not working for files 
opened with CreateFile. You have to install Microsoft's VSS SDK and copy 
the include files and libraries to rsync-2.6.6/vss/inc and 
rsync-2.6.6/vss/lib respectively.

-- Forwarded message --
To:   Nicholas Hall [EMAIL PROTECTED],
  Elias Penttilä [EMAIL PROTECTED]
From: Kris Boutilier [EMAIL PROTECTED]
Cc:   backuppc-users@lists.sourceforge.net
Date: Thu, 13 Apr 2006 12:00:03 -0700
Subj: RE: [BackupPC-users] rsync with VSS-support

 -Original Message-
 From: [EMAIL PROTECTED] 
 [mailto:[EMAIL PROTECTED] On Behalf 
 Of Nicholas Hall
 Sent: Thursday, April 13, 2006 11:33 AM
 To: Elias Penttilä
 Cc: backuppc-users@lists.sourceforge.net
 Subject: Re: [BackupPC-users] rsync with VSS-support
 
 On 4/10/06, Elias Penttilä [EMAIL PROTECTED] wrote:
  ...
  NOTE that this only works on Windows XP currently, I tried it on 
  Server 2003, but for some reason it didn't work.
 
 Does anyone know what's involved with getting this to work on 
 2000/2003 server?
 

The Volume Shadow Copy Service was only introduced with Windows 2003 
Server/Windows XP - 2000 has no such facility without purchasing third party 
glue such as St. Bernard Open File Manager 
(http://www.stbernard.com/products/ofm/products_ofm.asp).

However, if someone knows a technique to shoehorn the Microsoft VSS server 
service into a Windows 2000 server I'd love to hear about it.

Kris Boutilier
Information Services Coordinator
Sunshine Coast Regional District

-- Forwarded message --
To:   Kris Boutilier [EMAIL PROTECTED]
From: Les Mikesell [EMAIL PROTECTED]
Cc:   Nicholas Hall [EMAIL PROTECTED],
  Elias Penttilä [EMAIL PROTECTED],
  backuppc-users@lists.sourceforge.net
Date: Thu, 13 Apr 2006 14:53:59 -0500
Subj: RE: [BackupPC-users] rsync with VSS-support

On Thu, 2006-04-13 at 14:00, Kris Boutilier wrote:
  
  Does anyone know what's involved with getting this to work on 
  2000/2003 server?
  
 
 The Volume Shadow Copy Service was only introduced with Windows 2003 
 Server/Windows XP - 2000 has no such facility without purchasing third party 
 glue such as St. Bernard Open File Manager 
 (http://www.stbernard.com/products/ofm/products_ofm.asp).
 
 However, if someone knows a technique to shoehorn the Microsoft VSS server 
 service into a Windows 2000 server I'd love to hear about it.

Before anyone gets too carried away with this, they should probably
find out if it is legal to distribute.  Rsync is covered by the
GPL which demands that all parts of a work must be redistributable
under GPL terms or it can't be distributed at all.  I assume that
the VSS SDK code can't be.  There's an exception for components that
are normally distributed with the operating system that might apply
since this is a download from Microsoft and the download procedure
checks to see if you are running windows.  However, I think the
exception requires dynamic linking and for each user to obtain
his own copy of the non-gpl'd component separately.

-- 
  Les Mikesell
   [EMAIL PROTECTED]

-- Forwarded message --
To:   Carl Wilhelm Soderstrom [EMAIL PROTECTED]
From: Travis Fraser [EMAIL PROTECTED]
Cc:   backuppc-users@lists.sourceforge.net
Date: Wed, 24 May 2006 17:52:33 -0400
Subj: Re: 

Re: [BackupPC-users] odd 3.0.0 messages

2006-11-10 Thread Craig Barratt
David writes:

 The following messages are in my BackupPC log:
 
 2006-11-10 03:58:39 Botch on admin job for  admin : already in use!!
 2006-11-10 03:58:39 Botch on admin job for  admin : already in use!!
 2006-11-10 03:58:39 Botch on admin job for  admin : already in use!!
 2006-11-10 03:58:40 Botch on admin job for  admin : already in use!!
 2006-11-10 03:58:40 Botch on admin job for  admin : already in use!!
 
 What do they mean?

Strange.  It means that it is trying to start another nightly admin
job while the previous one is still running.  Is BackupPC_nightly taking
24 hours to run?

Can you email me off-list the last two days (today and yesterday) of
LOG files?

Craig



Re: [BackupPC-users] rsync fatal error: md4 doesn't match on retry; file removed

2006-11-11 Thread Craig Barratt
Dale writes:

 It does not look like any of the files are changing during the backup.
 
 Also, I think this only started happening since we upgraded BackupPC from 2 
 to 3.  It is happening on all our backup servers as well, not just one.
 
 Anything I can do to help you debug this, let me know.

Ok.  Could you please do a backup with a small list of files that includes
one that fails (or ideally a backup that includes just the single file that
fails), and increase $Conf{XferLogLevel} to, say, 8?  Please send me the
XferLOG file off-list.  It might be quite large.

Are you using rsync checksum caching (ie: --checksum-seed=32761 included
in $Conf{RsyncArgs})?  If so, you could try removing that option.

Craig



Re: [BackupPC-users] BackupPC Beta 3.0 config.pl error

2006-11-11 Thread Craig Barratt
Jerry writes:

 Been using BackupPC Beta 3.0 for a while, its great to be able to view and
 edit right from the web interface.  I don't know if anyone else has a
 problem with the config.pl file after changing it through the web interface.
 Error comes back, can't read config.pl file.  This is what I have found to
 be the problem:

Thanks for figuring this out.  This was also reported by Les Stott.
It will shortly be fixed in CVS and will be in the 3.0.0beta2 release.

Craig



[BackupPC-users] File-RsyncP-0.66 released on CPAN and SF

2006-11-12 Thread Craig Barratt
I just released File-RsyncP-0.66 on CPAN and SF.
This is a bug fix release.  Here are the changes:

  - Support turning off --perms option, reported by Cameron Dale.

  - Applied patches from Mark Weaver to handle skipping duplicate
file names.

  - Added FileList/snprintf.c to handle solaris configure failures.

Enjoy!
Craig


