Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-28 Thread Craig Barratt
Steven writes:

> could this have to do with the fact that OSX switched the default tar
> from gnutar to bsdtar with Snow Leopard?
>
> http://discussions.apple.com/thread.jspa?threadID=2144311&tstart=0
>
> I think gnutar is still there, you might just have to change the
> arguments for calling.

Thanks, you're right - the default is now bsdtar.  BackupPC_dump watches
the output from tar and counts the number of lines that start with ./.
Unfortunately bsdtar doesn't prefix each file name it prints with ./ as it
adds files to the archive, so BackupPC_dump thinks there are no files being
archived.

The best option is to change $Conf{TarClientPath} on Snow Leopard
clients from tar to gnutar (eg: /bin/gnutar, assuming that's the
right path).
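
For example, a per-host override could look like the sketch below (the file
name and the /usr/bin/gnutar path are assumptions; check the actual location
on the client with `which gnutar` first):

# pc/snowleopard-client.pl -- hypothetical per-host config file
$Conf{TarClientPath} = '/usr/bin/gnutar';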

Craig



[BackupPC-users] Beta version - missing menu items

2009-09-28 Thread Stephen Vaughan
Hi,

I've installed the latest build of the beta version and I've noticed that
some things have disappeared from the left-hand menu. Edit Config, Log File,
and a bunch of others aren't available; is that normal?

-- 
Best Regards,
Stephen
Sent from Sydney, NSW, Australia


[BackupPC-users] Exponential expiring incremental backups with IncrKeepCnt?

2009-09-28 Thread Christian Neumann
Hi there,
 
I've recently been trying to replace our purely hard-link-based backup
solution (snapback) with something more advanced like BackupPC. I've gotten
quite far, but now there is one thing I don't understand.
 
The documentation mentions exponentially expiring incremental backups
(http://backuppc.sourceforge.net/faq/BackupPC.html#backup_basics): "BackupPC
can also be configured to keep a certain number of incremental backups, and
to keep a smaller number of very old incremental backups."
 
But as soon as I try to enter an array for IncrKeepCnt, just as is
described for FullKeepCnt
(http://backuppc.sourceforge.net/faq/BackupPC.html#what_to_backup_and_when_to_do_it),
I get an error message saying that IncrKeepCnt must be an integer.
 
Are exponentially expiring incremental backups supported? If not, is there
a reason behind it?
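
For illustration, here is a sketch of the two settings involved (the
FullKeepCnt line follows the documented array example; the IncrKeepCnt line
is the kind of value that gets rejected):

# accepted: exponential expiry of full backups, as documented
$Conf{FullKeepCnt} = [4, 2, 3];
# rejected: BackupPC complains that IncrKeepCnt must be an integer
$Conf{IncrKeepCnt} = [6, 2];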

Thanks a lot for answers,
christian




Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-28 Thread Fernando Laudares Camargos
Hello Holger and Les,

Holger Parplies wrote:
> Hi,
>
> [can we agree on avoiding tabs in subject lines?]
>
> Les Mikesell wrote on 2009-09-25 23:25:35 -0500 [Re: [BackupPC-users] Backing
> up a BackupPC host - *using rsync+tarPCCopy*]:
>> Fernando Laudares Camargos wrote:
>>> [...]
>>> I'm doing two things (although I'm not sure that answers your question
>>> correctly):
>>>
>>> 1) rsync of cpool without --delete (so, cpool will keep growing, no files
>>> will ever be deleted. I assume that's fine apart from the fact it will take
>>> more disk space).
>> BackupPC_nightly may rename chains of hash collisions in cpool as part of its
>> cleanup.  If such a rename occurs between the rsync runs and the
>> BackupPC_tarPCCopy or restore, you'll end up with links to the wrong files.

I wasn't aware of the fact that BackupPC_nightly renames chains of hash
collisions in cpool, so indeed it's not as harmless as I first thought ...

> actually, I don't believe you even need that to happen for problems to occur.
>
> As far as an rsync pool update is concerned, the contents of some pool files
> will have changed if a chain gets renumbered. rsync has no concept of renamed
> files, and even if it did, from looking at the pool alone it couldn't know
> what to do (because that depends on the other links pointing to the file).

Ok, one more point to consider in the approach I'm using ...

> If you are using --inplace, I believe the destination pool files will be
> overwritten, thereby making *previously existing links to them* point to
> incorrect content. You're probably not doing that, so you will probably only
> have the pool file deleted and replaced with a new one with new contents. As a
> result, the existing links in the pc/ directories will no longer take part in
> pooling in your copy. You'll have a new independent copy of the contents under
> the new pool file name which subsequent backups might link to (providing it's
> not renamed again). I really don't see you gaining anything from running rsync
> *without* --delete. With --delete, you could at least expire backups from your
> copy (i.e. pc/host/num/ trees) and get back some space (well, more space,
> really, because you get back some space from files severed from pooling by
> chain renumbering as described above).

I'm not using --inplace and I see your (valid) point for using --delete.
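
For reference, the pool copy I'm running is roughly the sketch below (host
names and paths are placeholders, not our real setup; only the --delete flag
would change based on this discussion):

# sketch: mirror the compressed pool to the secondary server
rsync -aH --numeric-ids --delete /var/lib/backuppc/cpool/ \
    backup2:/var/lib/backuppc/cpool/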

> What exactly are you trying to do, anyway?
>
> 1. Have a copy of the pool that BackupPC could run on if the original pool is
>    lost, or
> 2. have a copy of the pool suitable for *restoring files only* if the original
>    pool is lost, or
> 3. something else?
>
> You're not achieving (1), though (2) would probably work.

What I'm trying to do is (2). Actually, from what I have read on this list,
the desire to have a backup of the data on the main BackupPC server is
common among many users. Having two independent backup servers located at
different sites would place a double load on the clients, and sometimes that
is not feasible (if the backup takes all night to complete, for example), as
opposed to concentrating the load of the secondary backup on the main backup
server.

So, to get back to your question, what we're trying to accomplish is to have
a synchronized copy of the data (cpool + backup sets of the PCs) from the
main BackupPC server on a separate server. If we lost the main server we
would like to be able to:
1) restore files
2) start using the secondary server to make the backups until we can recover
the main server

The situation described above could be achieved using DRBD+Heartbeat when
you have a really good network connection between the primary and the
secondary backup servers, which is not our case most of the time.

In fact, if we could guarantee that all files are in the cpool and we had a
way to identify them in the repository (using a database to relate the
md5sum to a file name, for instance), that could solve part of the problem.
We would only need to rsync the cpool and, in case of a disaster, we could
at least manually recover the essential files. It's not a complete solution,
but one that would fit well in some cases.

I'm going to try Jeffrey's script to re-execute linking today and see how that 
modifies the size of the tar files created with BackupPC_tarPCCopy.

> How much more disk space have you got for your copy?

Not that much more, around 15%, but the system has not been in use for very
long, and this rate will surely grow over time.

I'm glad we're taking the time to discuss this again; I'm sure it will
benefit a lot of the people using this great piece of software that
BackupPC is.

Regards,
-- 
Fernando Laudares Camargos

  Révolution Linux
http://www.revolutionlinux.com
---
* Any views and opinions presented in this e-mail are solely those of
the author and do not necessarily represent those of Révolution Linux.

[BackupPC-users] problem purging files with pool size

2009-09-28 Thread backuppc
Hi there.

I'm having a problem purging files and with the size of my pool.

I'm running version 2.1.2pl0.

In the past, I've modified $Conf{FullKeepCnt} so that it's more
conservative, and then run BackupPC_nightly, and it's trimmed the pool.
This is no longer working.

For example, here are the relevant lines from my config file (note that
$Conf{FullKeepCnt} used to be $Conf{FullKeepCnt} = [2,1,1,0,1];):

$Conf{FullKeepCnt} = [2];
$Conf{FullKeepCntMin} = 1;
$Conf{FullAgeMax} = 30;
$Conf{IncrKeepCnt} = 1;
$Conf{IncrKeepCntMin} = 1;
$Conf{IncrAgeMax} = 30;
$Conf{PartialAgeMax} = 3;
$Conf{IncrFill} = 0;

Yet backups are showing like this for a particular host:

#    Type  Filled  Start Date   Duration/mins  Age/days  Server Backup Path
112  full  yes     12/16 20:02   78.9          285.5     /home/backuppc/pc/prettylady/112
201  full  yes     4/6 20:04    106.1          174.5     /home/backuppc/pc/prettylady/201
310  full  yes     7/28 20:00   192.8           61.5     /home/backuppc/pc/prettylady/310
338  full  yes     8/25 20:10   184.8           33.5     /home/backuppc/pc/prettylady/338
345  full  yes     9/1 20:10    189.9           26.5     /home/backuppc/pc/prettylady/345
351  incr  no      9/7 20:04      7.7           20.5     /home/backuppc/pc/prettylady/351
352  full  yes     9/8 20:05    210.7           19.5     /home/backuppc/pc/prettylady/352
353  incr  no      9/9 20:00      5.2           18.5     /home/backuppc/pc/prettylady/353

What's going on?

Here's the disk space report:

/dev/sda1 147G  135G  4.5G  97% /home/backuppc

As a temporary measure, I think I can manually delete files by, for
example, deleting all the files in /home/backuppc/pc/prettylady/112 (for
the above host) and then running nightly?

Thanks.




Re: [BackupPC-users] OT: (e.g.) sed command to modify configuration file

2009-09-28 Thread Chris Robertson
Timothy J Massey wrote:
> Hello!
>
> I have a shell script that I use to install BackupPC.  It takes a standard
> CentOS installation and performs the configuration that I would normally
> do to install BackupPC.  There are probably way better ways of doing this,
> but this is the way I've chosen.
>
> As part of this script, I use sed to modify certain configuration files.

Why modify, when you can replace?


cat > /etc/ssh/sshd_config <<EOF
# This is my sshd_config.
# There are many like it, but this one is mine...
Protocol 2
PermitRootLogin no
EOF


Be aware, this is not a complete list of options.  egrep -v '(^#|^$)'
/etc/ssh/sshd_config (run it before you run the above cat!) is more likely
to be.
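
In other words, something along these lines (a sketch; the output path is
just an example):

# capture the currently effective options before overwriting the file
egrep -v '^(#|$)' /etc/ssh/sshd_config > /root/sshd_config.effective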

Chris




Re: [BackupPC-users] problem purging files with pool size

2009-09-28 Thread Holger Parplies
Hi,

backu...@omidia.com wrote on 2009-09-28 08:48:44 -0700 [[BackupPC-users]
problem purging files with pool size]:
> I'm having a problem purging files and with the size of my pool.
>
> I'm running version 2.1.2pl0.
>
> In the past, I've modified $Conf{FullKeepCnt} so that it's more
> conservative, and then run BackupPC_nightly, and it's trimmed the pool.

I don't believe that is actually true. Backup expiration is done by
BackupPC_dump, not by BackupPC_nightly. I believe your problem with 2.1.2 is
that no dumps (and no expiration) are done when the pool FS is more than
DfMaxUsagePct full (actually I don't have the 2.1.2 source here, but in 2.1.1
that is the case; in 3.0.0beta3 it's fixed; the changelog doesn't seem to say
in which version it was changed).

> [...]
> Here's the disk space report:
>
> /dev/sda1 147G  135G  4.5G  97% /home/backuppc

You might try temporarily increasing $Conf{DfMaxUsagePct} to 97 or 98 (and
then run a backup or wait for one to run automatically). Depending on how
large your backups typically are, you might even keep it there (7.35GB (5% of
147GB) is a lot of space to keep reserved - unless your backups are typically
that large; how much space do you need so that $Conf{MaxBackups} backups can
be started and complete without the FS filling up?).
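
A sketch of the temporary change (in config.pl or via the web config editor;
98 is just an example value):

# temporarily allow dumps (and thus expiration) on a nearly-full FS
$Conf{DfMaxUsagePct} = 98;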

Also note that backups don't seem to be expired for hosts for which backups
are disabled.

> As a temporary measure, I think I can manually delete files by, for
> example, deleting all the files in /home/backuppc/pc/prettylady/112 (for
> the above host) and then running nightly?

Presuming you don't miss any dependencies (incremental backups that are based
on that full backup), you can do that (though you'll still have an entry for
that backup in the backups file) - move the directory to $TopDir/trash and
trashClean will even do it for you in the background - but it's safer to let
BackupPC handle expiration.
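
I.e., something like the sketch below (paths match the example host above;
adjust to your actual $TopDir):

# let trashClean remove the backup's tree in the background
mv /home/backuppc/pc/prettylady/112 /home/backuppc/trash/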

Regards,
Holger



[BackupPC-users] unclean pool

2009-09-28 Thread Tony Schreiner
Perhaps my situation is related to some recent posts, maybe not.


I have been running low on space on my __topdir__ volume, so several
times in recent weeks, I have moved numbered backup directories to the
__topdir__/trash folder and removed the appropriate line from the
corresponding backups file.

This has recovered a small amount of space, but not nearly as much as I
expected. Today I looked at the pool directory and found that many (even
a majority) of the files there have a single link; shouldn't
BackupPC_nightly be removing those?

And as a follow-up, are those files candidates for me to remove if
they're not being removed properly?

BackupPC 3.1.0

Tony Schreiner



Re: [BackupPC-users] Exponential expiring incremental backups with IncrKeepCnt?

2009-09-28 Thread Holger Parplies
Hi,

Christian Neumann wrote on 2009-09-28 15:57:24 +0100 [[BackupPC-users] 
Exponential expiring incremental backups with IncrKeepCnt?]:
> [...]
> The documentation mentions exponentially expiring incremental backups
> (http://backuppc.sourceforge.net/faq/BackupPC.html#backup_basics): "BackupPC
> can also be configured to keep a certain number of incremental backups, and
> to keep a smaller number of very old incremental backups."

while I don't really understand what "keep a smaller number of very old
incremental backups" is supposed to mean, there is no mention of exponential
incremental backup expiry. If you read the preceding paragraph on full
backups, you'll notice that it's described very explicitly there. If there
were exponential expiry of incrementals, there would be at least a clear
reference to that description.

> [...]
> Are exponentially expiring incremental backups supported? If not, is there a
> reason behind it?

Exponential expiry of incremental backups really makes no sense (and it's not
sanely implementable with multi-level incrementals anyway). With BackupPC, you
*need* regular full backups(*) (if the wiki were functional, there would
probably be a page explaining why), and storing full backups is only
insignificantly more expensive than storing incrementals anyway. For this
reason, incremental backups are always fairly young (mine are up to 60 days
old, and I doubt anyone keeps them much longer). To keep an incremental backup,
you also need to keep the full backup it was made against(!), so the age
difference between the incremental and its full will never exceed
$Conf{FullPeriod} (the time between two full backups). With exponential
incremental backup expiry, you would quickly exceed $Conf{FullPeriod}, meaning
you would be keeping full backups (if only to support the incrementals) that
are closer to the incrementals than they are to each other. Why would you want
that?
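
To illustrate the kind of exponential schedule that *is* supported for full
backups (a sketch; the numbers are only an example):

# keep 4 fulls at FullPeriod spacing, 2 at 2x FullPeriod, 3 at 4x FullPeriod
$Conf{FullKeepCnt} = [4, 2, 3];
# incrementals remain a plain count
$Conf{IncrKeepCnt} = 6;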

Incremental backups are there for gaining a speed advantage - an advantage
that will allow you to make daily (or hourly or whatever) backups. Full
backups are (amongst other purposes) for keeping exponentially - yearly
backups for the last 10 years, monthly for the last two years, weekly for the
last six months (just to give you an idea). As with any backup system,
incremental backups are only a (good enough) approximation. Only full backups
give you a true snapshot (and that only if they are, in fact, taken of a
snapshot, but that's a different topic). You want to keep true snapshots
around for a long time, not approximations.

Regards,
Holger

(*) Actually, you probably need regular full backups with any backup scheme.
It's just that on this list, we make a point of telling you ;-).



Re: [BackupPC-users] unclean pool

2009-09-28 Thread Holger Parplies
Hi,

Tony Schreiner wrote on 2009-09-28 20:01:24 -0400 [[BackupPC-users] unclean 
pool]:
> I have been running low on space on my __topdir__ volume, so several
> times in recent weeks, I have moved numbered backup directories to the
> __topdir__/trash folder and removed the appropriate line from the
> corresponding backups file.
>
> This has recovered a small amount of space, but not nearly as much as I
> expected. Today I looked at the pool directory and found that many (even
> a majority) of the files there have a single link; shouldn't
> BackupPC_nightly be removing those?

yes, it should. Is BackupPC_nightly being run (check the log files under
$LogDir)? What are the values of

$Conf{BackupSchedule}
$Conf{MaxBackupPCNightlyJobs}
$Conf{BackupPCNightlyPeriod}

?

> And as a follow-up, are those files candidates for me to remove if
> they're not being removed properly?

Well, yes, but you really need to fix whatever problem you're having rather
than get yourself a night job as a replacement for BackupPC_nightly ;-).
You'd also need to take care of pool chain renumbering and of preventing
BackupPC_link from running while you're busy, so it's a bit more difficult
than 'find pool cpool -links 1 -exec rm {} \;' ...
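
If you just want to measure the problem first without deleting anything, a
rough sketch (run from $TopDir):

# count pool files that no backup references (single-link files)
find pool cpool -type f -links 1 | wc -l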

Regards,
Holger



[BackupPC-users] Trick for Restoring Drupal Website via tar File

2009-09-28 Thread Norbert Hoeller
I tested restoring a Drupal website by having BackupPC generate a tar
file, uploading the file to the server and then extracting the tar file
into the new Drupal directory structure.  A large number of files were not
restored because a number of Drupal sub-directories are read-only.  Errors
included 'Cannot open: Permission denied' and 'Cannot open: No such file
or directory'.

I found a reference to the '--delay-directory-restore' option at 
http://www.gnu.org/software/tar/manual/tar.html#SEC77 that solved this 
problem.
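
The extraction command ends up looking roughly like the sketch below (the
archive name and target directory are placeholders):

# defer setting directory permissions until all files have been extracted
tar --delay-directory-restore -xvf backuppc-restore.tar -C /var/www/drupal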