[BackupPC-users] Fw : Re: client's backup's files on extern disk

2009-09-25 Thread KOUAO aketchi


--- On Thu, 9/24/09, Craig Barratt cbarr...@users.sourceforge.net wrote:

From: Craig Barratt cbarr...@users.sourceforge.net
Subject: Re: client's backup's files on extern disk
To: KOUAO aketchi aketc...@yahoo.fr
Date: Thursday, September 24, 2009, 8:03 PM

KOUAO writes:

 I have a server on which BackupPC is installed, and it works well on a
 Debian server. But these days my backup work space is full (96%), so I
 want to move some PCs' backup files to an external USB disk. Could you
 tell me how I can do that?
 Thanks a lot for your help.

Please post your questions to the user mail list.

Craig



--
Come build with us! The BlackBerry® Developer Conference in SF, CA
is the only developer event you need to attend this year. Jumpstart your
developing skills, take BlackBerry mobile applications to market and stay
ahead of the curve. Join us from November 9-12, 2009. Register now!
http://p.sf.net/sfu/devconf
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] security headaches

2009-09-25 Thread Tino Schwarze
On Fri, Sep 25, 2009 at 05:51:41AM -0400, Andrew Schulman wrote:

 Here's my problem:  I love having online backups, they're very
 convenient.  But they're a huge security problem.  All of the LAN's
 most sensitive files become readable by user backuppc, who can be
 attacked through the web application.  Worse, all of the files become
 readable by the BackupPC administrative user, and each host's files by
 that host's designated backup owner.  If any of these has a weak
 password, or if the BackupPC login doesn't run over SSL, or if the
 htdigest file is unprotected, then we give away the store.  Root
 security for the whole LAN becomes equivalent to a whole bunch of
 typically weaker links.
 
 My question for you is, how are people addressing this problem?
 Enforcing strong passwords? Limiting the number of users with restore
 rights?  Segmenting your hosts into sensitive and less-sensitive
 files?

Our setup only gives administrators access to the backup machine. It's
considered an isolated system where nobody but administrators has
access. The web interface (which is optional, not necessary, BTW) is
SSL-secured and password protected, of course.

Backup storage is always a very security-sensitive part of the
infrastructure... And it's always a matter of balancing security vs.
ease of use.

Bye,

Tino.

-- 
What we nourish flourishes. - Was wir nähren erblüht.

www.lichtkreis-chemnitz.de
www.craniosacralzentrum.de



Re: [BackupPC-users] Exclude Hidden files (Linux + SSH + tar)

2009-09-25 Thread Holger Parplies
Hi,

kadamba wrote on 2009-09-24 04:34:32 -0400 [[BackupPC-users]  Exclude Hidden 
files (Linux + SSH + tar)]:
 [...]
 I've installed BackupPC on Debian Lenny to back up Ubuntu desktops (/home for
 all), and I need to exclude all hidden files, which are taking up lots of
 space. My focus is on user data files.
 
 With reference to this article
 http://www.backupcentral.com/phpBB2/two-way-mirrors-of-external-mailing-lists-3/backuppc-21/how-to-include-certain-files-76171/

(it's not an article, it's a mailing list post, as, I believe, someone sitting
in a glasshouse has promised should be obvious)

 I tried the following (excuse me, i am perl ignorant )
 
 1st. $Conf{BackupFilesExclude} = { '*' => ['.*'] };
 2nd. $Conf{BackupFilesExclude} = { '*' => ['.*'], };
 3rd. $Conf{BackupFilesExclude} = { '*' => ['.*', '*/.*'], }
 4th. $Conf{BackupFilesExclude} = { '*' => ['.*', '*/.*'] }

Well, first of all, a big *thank you* to Backup Central for mangling that -
it's pretty much useless for *painlessly* debugging. And while I'm at it,
*thank you* (to Backup Central) for not line-wrapping, too (and yes, I know
that people posting directly to the mailing list may also fail to wrap their
lines). And yes, you don't want to wrap lines from log or configuration files.
That's kind of a problem when translating from board to mailing list, isn't
it?
/sarcasm


That said, I don't think it's an error caused by a syntactical mistake. You
seem to be excluding '.*', which includes (err, excludes :) '.' - the root of
the whole transfer. This leads to:

 2. LOG file: Got fatal error during xfer (No files dumped for share /home)

For what you seem to want to do, you should try something like

$Conf{BackupFilesExclude} = {
    '*' => [ '+ .', '.*' ],
};

(which is really sort of a hack, because the exclude '+ .' is actually an
include by virtue of starting with '+ '; you could also use a single exclude
pattern like '.[a-zA-Z0-9]*', but you're bound to miss some characters, and
rsync doesn't seem to support things like '.[^.]*' - at least I can't find it
in the man page, feel free to try it out just the same).

You should note, though, that you are excluding any files whose names start
with '.' as well as any directories with names starting with '.', *including
all their substructure*. Traditionally, files and directories starting with
'.' are supposed to be hidden, so you might actually want that. But you should
be aware just the same that you are also excluding files *not* starting with
'.' if any of the directories in the path start with a '.' (e.g.
'/.gconf/apps/gthumb/ui/%gconf.xml').

 Other Info:
 1. In Status page Failures that need attention has nothing
 2. In the Host Summary web page it would say done

Strange. Have you got any prior successful backups for the host (without the
exclude), or was this the first attempt? Are you sure you don't have enough
space for the '.' files? User configuration is *usually* not much data (and
static at that, and partially identical between users (=> BackupPC pooling)),
but can be quite some trouble to set up again in case of loss. There may be
notable exceptions (like caches of web browsers, evolution, thumbnails, trash
directories, ...). Maybe you could specifically exclude those instead of *all*
configuration data? Of course, your mileage may vary.
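If going after the specific space-eaters sounds right, the effect of such an exclude list can be checked locally with plain GNU tar before touching the BackupPC configuration. The directory layout below is invented for the demonstration, and the exclude names are just the usual suspects:

```shell
# Build a tiny fake /home to test the excludes against (paths are made up).
demo=$(mktemp -d)
mkdir -p "$demo/alice/.cache" "$demo/alice/docs" "$demo/alice/.local/share/Trash"
touch "$demo/alice/.bashrc" "$demo/alice/.cache/junk" "$demo/alice/docs/report.txt"

# GNU tar matches exclude patterns against any path component by default,
# so a bare name excludes that directory (and everything below it) anywhere.
tar -c -f /dev/null -v -C "$demo" \
    --exclude='.cache' \
    --exclude='.thumbnails' \
    --exclude='Trash' \
    . | sort
# .bashrc and docs/report.txt are listed; .cache and Trash are skipped.
```

Once the listing looks right, the same patterns can be moved into the host's $Conf{BackupFilesExclude}.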

Regards,
Holger



[BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread jingai
Backups of two of my Snow Leopard clients using tar over ssh are
failing, while a third Snow Leopard client succeeds with the same
configuration.

The error log is as follows:

Running: /usr/bin/ssh -q -x -n -l root kuramori env LC_ALL=C /usr/bin/tar -c -v -f - -C / --totals --one-file-system --exclude=./Developer --exclude=./Users/jingai/Movies --exclude=./Users/jingai/Music/iTunes/iTunes\\\ Music --exclude=./Users/jingai/tmp --exclude=pr0n --exclude=./Volumes --exclude=.Trash --exclude=./Trash --exclude=./.Trashes --exclude=./cores --exclude=.Spotlight-V100 --exclude=./automount --exclude=./Network --exclude=./private/var/automount --exclude=./private/var/run --exclude=./private/var/vm --exclude=./private/var/tmp --exclude=./private/tmp --exclude=Caches --exclude=CachedMessages ./
full backup started for directory /
Xfer PIDs are now 25906,25905
[ ... SNIP FILE LISTING, SEEMS COMPLETE.. ]
tarExtract: Done: 0 errors, 684376 filesExist, 30478227391 sizeExist, 20789892662 sizeExistComp, 711718 filesTotal, 31326751741 sizeTotal
Got fatal error during xfer (No files dumped for share /)
Backup aborted (No files dumped for share /)
Not saving this as a partial backup since it has fewer files than the prior one (got 711718 and 0 files versus 719791)

I don't see where it's having problems.  Anyone can help?

Thanks,
Jonathan



Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread jingai
On Sep 25, 2009, at 8:25 AM, Holger Parplies wrote:

 jingai reposted on 2009-09-25 08:01:45 -0400 [[BackupPC-users]  
 Troubles with 2 Snow Leopard clients using tar over ssh]:
 [...]
 Anyone can help?

 if we couldn't yesterday, how likely are we to today?

I am not getting my own posts apparently, so I thought the list never  
received it.  My apologies.

-Jonathan



Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread Holger Parplies
Hi,

jingai reposted on 2009-09-25 08:01:45 -0400 [[BackupPC-users] Troubles with 2 
Snow Leopard clients using tar over ssh]:
 [...]
 Anyone can help?

if we couldn't yesterday, how likely are we to today?

Regards,
Holger



Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-25 Thread Fernando Laudares Camargos
Hello,

I beg your pardon for bringing this topic to the table again. I have read this
entire thread (even the scientific debate about probability calculations) and
have tried the main suggestions you detailed to solve this problem (except the
RAID mirror approach described by Les Mikesell, which does not fit our needs).

What worked best for us was the strategy of copying the 'cpool' with a
standard rsync (v3) and then using BackupPC_tarPCCopy to re-create the backup
sets. I have refined that approach by creating a script that breaks the rsync
of the 'cpool' down into multiple rsyncs (one per sub-directory) and smartly
runs BackupPC_tarPCCopy only for each PC's new backup sets.
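For reference, the per-sub-directory split described above can be sketched in a few lines of shell. The paths and destination host are placeholders, and a BackupPC 3.x cpool layout (sixteen top-level hex directories) is assumed:

```shell
#!/bin/sh
# Sketch: rsync the cpool one top-level hex directory at a time.
# SRC/DEST are placeholders; -H preserves hard links within each run.
SRC=${SRC:-/var/lib/backuppc/cpool}
DEST=${DEST:-backup-host:/var/lib/backuppc/cpool}

for d in 0 1 2 3 4 5 6 7 8 9 a b c d e f; do
    [ -d "$SRC/$d" ] || continue   # skip hex dirs that don't exist yet
    rsync -aH --delete "$SRC/$d/" "$DEST/$d/"
done
```

Splitting per directory is safe for the cpool itself because pool files are not hard-linked to one another across the hex directories; the links into the pc/ tree are re-created afterwards by BackupPC_tarPCCopy.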

This approach works best in the environments where we use Coraid boxes and can
mount both the regular BackupPC partition and the backup partition on the same
server, so that BackupPC_tarPCCopy can write directly to the backup partition.
In the other cases we need to create a tar file with BackupPC_tarPCCopy, copy
it to the backup server over the network, and then untar the file - which adds
a new level of complexity to the solution.

Anyway, the first step in this approach (the rsync of the cpool) works
reasonably well (for cpools containing a few terabytes of data). What doesn't
always work well is that BackupPC_tarPCCopy sometimes produces tar files that
are too big (several GB, as for the backups of Zimbra and database servers),
which brings me to the following question:

* Why does BackupPC_tarPCCopy sometimes produce big tar files? *

From what I understand, after a backup completes, BackupPC_link runs to move
all the non-BackupPC-system files (i.e., everything except files like the
attrib files) into the 'cpool', replacing them with hard links. A tar file
generated from a 'linked' backup set would then mainly contain the relations
between the files that compose the data set and their relative positions in
the cpool, so the same tool can untar the file and re-create the data set with
hard links. Which brings me to my second and last question:

* Is this (BackupPC_tarPCCopy creating big files) happening because the
interval between my backups (the end of one and the start of the next) does
not give BackupPC enough time to run BackupPC_link, so the files in the data
sets end up not being 'linked' into the cpool? *

Ideally I would like to have all files in the cpool, so that the rsync of that
directory would take care of most of the trouble. If the answer to the second
question is yes, it would be great to have a way to run BackupPC_link manually
to clean up the backup sets.

I appreciate your views on these two questions.

Regards,
-- 
Fernando Laudares Camargos

  Révolution Linux
http://www.revolutionlinux.com
---
* Tout opinion et prise de position exprimée dans ce message est celle
de son auteur et pas nécessairement celle de Révolution Linux.
** Any views and opinion presented in this e-mail are solely those of
the author and do not necessarily represent those of Révolution Linux.


Peter Walter wrote:
 All,
 
 I have implemented backuppc on a Linux server in my mixed OSX / Windows 
 / Linux environment for several months now, and I am very happy with the 
 results. For additional disaster recovery protection, I am considering 
 implementing an off-site backup of the backuppc server using rsync to 
 synchronize the backup pool to a remote server. However, I have heard 
 that in a previous release of backuppc, rsyncing to another server did 
 not work because backuppc kept changing the file and directory names in 
 the backup pool, leading the remote rsync server to having to 
 re-transfer the entire backup pool (because it thinks the renamed files 
 are new files).
 
 I have searched the wiki and the mailing list and can't find any 
 discussion of this topic. Can anyone confirm that the way backuppc 
 manages the files and directories in the backup pool would make it 
 difficult to rsync to another server, and, if so, can anyone suggest a 
 method for mirroring the backuppc server at an offsite backup machine?
 
 Regards,
 Peter
 



[BackupPC-users] catch up hump

2009-09-25 Thread James Ward
My busiest BackupPC server fell far behind before I did some tuning (thanks
for your help!) after I upgraded it from 2.x to 3.x, due to disk contention
between the nightly admin jobs and backups (which I had running 24x7). I think
I have it performing similarly to 2.x now, but it just can't seem to catch up.
It's taking 7-8 days to make full backups of all the clients, and since the
full backup period is 6.97, it just keeps making full backups. Should I
temporarily or even permanently increase the full backup period to help it get
to the point where it can do incrementals?

-- 
Ward... James Ward
Linux System Administrator
System Administration
Phone:  520-290-0910 ext 268
Fax:520-546-3442
ICQ:201663408
Jabber: ja...@jabber.chinoc.net



Re: [BackupPC-users] Fw : Re: client's backup's files on extern disk

2009-09-25 Thread Michael Stowe

I'm going to start by pointing out that this is a really bad idea. A key
concept of BackupPC is pooling, where files shared across systems are stored
only once in the backup pool. In other words, if I'm backing up two identical
systems, there will be one set of files in the backup pool, and the idea of
moving one of those systems to external storage is nonsensical, at best. I
must move both, or neither.

As a practical matter, many or most files may not overlap, but hopefully
that helps illustrate how trying to put just one PC's backups somewhere
else is problematic, at best.

If you really want to do such a thing, your best bet is to run two
instances of BackupPC and keep them separate.  I don't think you'll be
happy with USB-attached external disks in any event.

 KOUAO writes:

 I have a server on which BackupPC is installed, and it works well on a
 Debian server. But these days my backup work space is full (96%), so I
 want to move some PCs' backup files to an external USB disk.
 Could you tell me how I can do that?
 Thanks a lot for your help.




[BackupPC-users] Periodic Backup of the pool

2009-09-25 Thread Daniele Davolio
Hi everyone,
is there a way to do a periodic backup of the pool, like if I want to 
store a snapshot of all the backups and keep them on a safe.
I'm quite new to Backuppc and I'm reading around about that.

Thanx!



Re: [BackupPC-users] catch up hump

2009-09-25 Thread Les Mikesell
James Ward wrote:
 My busiest BackupPC server fell very behind before I did some tuning (thanks 
 for your help!) after I upgraded it from 2.x to 3.x due to disk contention 
 between the nightly admin jobs and backups (which I had running 24x7).  I 
 think I have it performing similarly to 2.x now, but it just can't seem to 
 catch up.  It's taking 7-8 days to make full backups of all the clients, and 
 since the full backup period is 6.97, it just keeps making full backups.  
 Should I temporarily or even permanently increase the full backup period to 
 help it get to the point that it can do incrementals?

That would depend on the pattern of data change.  If a lot of files are 
changing the incrementals will keep getting bigger.  If there's not a 
lot of activity, it won't hurt to have longer full intervals.

-- 
   Les Mikesell
lesmikes...@gmail.com








Re: [BackupPC-users] Periodic Backup of the pool

2009-09-25 Thread Davide Brini
On Friday 25 September 2009 15:03:59 Daniele Davolio wrote:

 Hi everyone,
 Is there a way to do a periodic backup of the pool, as if I wanted to
 store a snapshot of all the backups and keep it in a safe?
 I'm quite new to BackupPC and I'm reading around about that.

Could BackupPC's archive feature be what you're looking for?

http://backuppc.sourceforge.net/faq/BackupPC.html#archive_functions

-- 
D.



[BackupPC-users] multiple pools

2009-09-25 Thread Bharat Mistry
Is it possible to have multiple pools?

My BackupPC box has a RAID1 pair of 1 TB drives - 750 GB is now used.

I guess it's time to think about adding a 2nd 1 TB pair.


Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-25 Thread Jeffrey J. Kosowsky
I have written a program BackupPC_fixLinks.pl that will go through
both your pool and pc chain to find (and fix) duplicate pool files and
missing links. Part of the fix is to run an equivalent to
BackupPC_Link on missing/broken links.

Note you can't just run BackupPC_Link (or its analog) unless you have
a list of files to run it on, which is part of what my program does.

I can send it to you or repost it if you are interested.

Fernando Laudares Camargos wrote at about 08:56:35 -0400 on Friday, September 
25, 2009:
  [...]
  * Why BackupPC_tarPCCopy sometimes produces big tar files ? *
  [...]
  * Is this (BackupPC_tarPCCopy creating big files) happening because the
  interval between the execution of my backups (the end of one and the start
  of the next one) is not giving enough time to BackupPC to run BackupPC_link
  and finally the files in the data sets are not being 'linked' in the cpool ? *
  [...]

Re: [BackupPC-users] security headaches

2009-09-25 Thread dan
I use iptables and allow access to the web interface only from my
workstation, and I disable inbound ssh for the root and backuppc users. I
also limit inbound traffic with iptables so that the BackupPC server must
open the session to the client.
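As a purely illustrative sketch of that policy (the workstation address and the HTTPS port are assumptions, not details from the original setup), the rules could be generated as an iptables-restore fragment:

```shell
#!/bin/sh
# Emit an iptables-restore fragment for the policy described above.
ADMIN_WS=192.168.1.50   # assumed address of the admin workstation

cat <<EOF
*filter
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
-A INPUT -p tcp --dport 443 -s $ADMIN_WS -j ACCEPT
-A INPUT -p tcp --dport 443 -j DROP
-A INPUT -p tcp --dport 22 -j DROP
COMMIT
EOF
# The first rule lets replies to sessions the BackupPC server itself opened
# back in; the last drops all inbound ssh. Disabling ssh logins for specific
# users is done in sshd_config instead, e.g.: DenyUsers root backuppc
```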


Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-25 Thread Les Mikesell
Fernando Laudares Camargos wrote:

 
 * Why BackupPC_tarPCCopy sometimes produces big tar files ? *

Are you doing something to control the timing of your copies compared to 
the changes backuppc would be making?  The files in cpool need to be 
exactly in sync with the directory snapshots made by BackupPC_tarPCCopy.

It sounds like you are hitting it at a time when there are still files in
the 'new' directories that don't have pool links yet, but I'm not sure
how you can tell when everything is linked.

-- 
  Les Mikesell
lesmikes...@gmail.com




Re: [BackupPC-users] multiple pools

2009-09-25 Thread dan
 Is it possible to have multiple pools?

 My BackupPC box has a RAID1 pair of 1 TB drives - 750 GB is now used.

 I guess it's time to think about adding a 2nd 1 TB pair.


Did you use LVM? You could rotate the data: set up the new RAID and add that
device to LVM. Create an LV and then move all the old data over. Then you can
remove the partition from the original drives, add it to the LVM, and extend
the new LV to include both RAID arrays.
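One possible command sequence for that shuffle, assuming the pool already lives on an LVM volume group. It is written as a dry-run that only prints the plan, and every device, VG and LV name below is invented:

```shell
#!/bin/sh
# Dry-run sketch of the LVM moves described above. Replace run() with
# direct execution (as root) once the plan matches your actual devices.
run() { echo "+ $*"; }

run pvcreate /dev/md1                              # the new RAID1 pair
run vgextend backupvg /dev/md1                     # add it to the VG
run pvmove /dev/md0 /dev/md1                       # migrate old extents (slow!)
run lvextend -l +100%FREE /dev/backupvg/backuppc   # grow across both arrays
run resize2fs /dev/backupvg/backuppc               # assuming an ext3 pool fs
```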

If you can manage to dump the 750 GB off to another drive, you could do that
and just create a new RAID10 and then restore onto that instead.


To directly answer your question: no, you cannot have two pools.


Re: [BackupPC-users] multiple pools

2009-09-25 Thread Michael Stowe
 is it possible to have multiple pools

Not without two instances of BackupPC, which is non-trivial.

 My BackupPC server has a RAID1 pair of 1TB drives - 750GB is now used

 I guess it's time to think about adding a 2nd 1TB pair

You may also want to think about extending the filesystem your pool sits
on instead of using completely separate mounts.



Re: [BackupPC-users] Periodic Backup of the pool

2009-09-25 Thread Daniele Davolio
Yes, I was reading about that. I think that with the right configuration 
it can fit my needs.
Thanx! :)

Davide Brini wrote:
 On Friday 25 September 2009 15:03:59 Daniele Davolio wrote:

   
 Hi everyone,
 is there a way to do a periodic backup of the pool, like if I want to
 store a snapshot of all the backups and keep it somewhere safe.
 I'm quite new to BackupPC and I'm reading around about that.
 

 Could BackupPC's archive feature be what you're looking for?

 http://backuppc.sourceforge.net/faq/BackupPC.html#archive_functions

   


-- 
==
Daniele Davolio
Master Training S.r.l. - Information Technology Department
Sede Legale: via Timolini, N.18 Correggio (RE) - Italy
Sede Operativa: via Sani N.15 (Int.6) 42100 REGGIO EMILIA (RE)
Tel +39 0522 268059 - +39 0522 1846007
Fax +39 0522 331673
E-Mail d.davo...@mastertraining.it
E-Mail serviziotecn...@mastertraining.it
==




Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-25 Thread Steve
It might be nice if some of the repair tools you guys create were
shipped as part of the whole package, maybe even accessible through
Admin Options ...

just a suggestion.

steve

On Fri, Sep 25, 2009 at 10:47 AM, Jeffrey J. Kosowsky
backu...@kosowsky.org wrote:
 I have written a program BackupPC_fixLinks.pl that will go through
 both your pool and pc chain to find (and fix) duplicate pool files and
 missing links. Part of the fix is to run an equivalent to
 BackupPC_Link on missing/broken links.

 Note you can't just run BackupPC_Link (or its analog) unless you have
 a list of files to run it on, which is part of what my program does.

 I can send it to you or repost it if you are interested.

 Fernando Laudares Camargos wrote at about 08:56:35 -0400 on Friday, September 
 25, 2009:
   Hello,
  
   I beg your pardon for bringing this topic to the table again. I have read 
 this entire thread (even the scientific debate around trusting probability 
 calculus) and have tried the main suggestions you detailed to solve this 
 problem (except the RAID mirror approach detailed by Les Mikesell, which does 
 not fit our needs).
  
   What worked best for us was the strategy of copying the 'cpool' with a 
 standard rsync (v3) and then using BackupPC_tarPCCopy to re-create the sets of 
 backups. I have even refined that approach by creating a script to break down 
 the rsync of the 'cpool' into multiple rsyncs (per sub-directory) and to 
 run BackupPC_tarPCCopy only for the new backup sets of each pc.
  
   This approach works best in the environments where we use Coraid boxes and 
 can mount both the regular BackupPC partition and the backup partition on the 
 same server, so BackupPC_tarPCCopy can write directly to the backup 
 partition. In the other cases we need to create a tar file with 
 BackupPC_tarPCCopy, copy it to the backup server over the network, and then 
 untar the file - that adds a new level of complexity to the solution.
  
   Anyway, the first step in this approach (the rsync of the cpool) works 
 reasonably well (for cpools containing a few terabytes of data). What 
 doesn't always work well is that BackupPC_tarPCCopy sometimes produces tar 
 files that are too big (several GB, as for the backups of Zimbra and database 
 servers), which brings me to the following question:
  
       * Why BackupPC_tarPCCopy sometimes produces big tar files ? *
  
   From what I understand, after the completion of a backup, BackupPC_link is 
 run to transfer all the non-BackupPC-system files (everything except, e.g., 
 attrib files) to the 'cpool', replacing them with hard links. A tar file 
 generated from a 'linked' backup set would then contain mainly a list of 
 relations between the files that compose the data set and their relative 
 positions in the cpool, so we can use the same tool to untar the file and 
 re-create the data set with hard links. Which brings me to my second and 
 last question:
  
      * Is this (BackupPC_tarPCCopy creating big files) happening because 
 the interval between the execution of my backups (the end of one and the 
 start of the next) is not giving BackupPC enough time to run 
 BackupPC_link, so the files in the data sets are not being 'linked' 
 into the cpool? *
  
   Ideally I would like to have all files in the cpool, so the rsync of this 
 directory would take care of most of the trouble. If the answer to the second 
 question is yes, it would be great to have a way to manually run 
 BackupPC_link to clean up the backup sets.
  
   I appreciate your views on those two questions.
  
   Regards,
   --
   Fernando Laudares Camargos
  
         Révolution Linux
   http://www.revolutionlinux.com
   ---
   * Tout opinion et prise de position exprimée dans ce message est celle
   de son auteur et pas nécessairement celle de Révolution Linux.
   ** Any views and opinion presented in this e-mail are solely those of
   the author and do not necessarily represent those of Révolution Linux.
  
  
   Peter Walter a écrit :
    All,
   
    I have implemented backuppc on a Linux server in my mixed OSX / Windows
    / Linux environment for several months now, and I am very happy with the
    results. For additional disaster recovery protection, I am considering
    implementing an off-site backup of the backuppc server using rsync to
    synchronize the backup pool to a remote server. However, I have heard
    that in a previous release of backuppc, rsyncing to another server did
    not work because backuppc kept changing the file and directory names in
    the backup pool, leading the remote rsync server to having to
    re-transfer the entire backup pool (because it thinks the renamed files
    are new files).
   
    I have searched the wiki and the mailing list and can't find any
    discussion of this topic. Can anyone confirm that the way backuppc
    manages the files and directories in the backup pool would make it
    difficult to rsync to another server, and, if so, can 

Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread Barb Weston
I too am having problems with backups on Snow Leopard.  I am also  
using tar over ssh, and my error logs look pretty much the same as  
Jonathan's.  At the end it says "No files dumped for share /" and  
shows the backup as failed.  It does, however, back up the new  
files I've added, and I can restore them.

B-


Barb Weston
Department of Computer Science
University of California, Davis
wes...@cs.ucdavis.edu




On Sep 25, 2009, at 5:01:45  AM, jingai wrote:

 Backups to two of my Snow Leopard clients using tar over ssh are
 failing, while one other Snow Leopard client is successful using the
 same configuration.

 The error log is as follows:

 Running: /usr/bin/ssh -q -x -n -l root kuramori env LC_ALL=C /usr/bin/
 tar -c -v -f - -C / --totals --one-file-system --exclude=./Developer  
 --
 exclude=./Users/jingai/Movies --exclude=./Users/jingai/Music/iTunes/
 iTunes\\\ Music --exclude=./Users/jingai/tmp --exclude=pr0n --
 exclude=./Volumes --exclude=.Trash --exclude=./Trash --
 exclude=./.Trashes --exclude=./cores --exclude=.Spotlight-V100 --
 exclude=./automount --exclude=./Network --exclude=./private/var/
 automount --exclude=./private/var/run --exclude=./private/var/vm --
 exclude=./private/var/tmp --exclude=./private/tmp --exclude=Caches --
 exclude=CachedMessages ./
 full backup started for directory /
 Xfer PIDs are now 25906,25905
 [ ... SNIP FILE LISTING, SEEMS COMPLETE.. ]
 tarExtract: Done: 0 errors, 684376 filesExist, 30478227391 sizeExist,
 20789892662 sizeExistComp, 711718 filesTotal, 31326751741 sizeTotal
 Got fatal error during xfer (No files dumped for share /)
 Backup aborted (No files dumped for share /)
 Not saving this as a partial backup since it has fewer files than the
 prior one (got 711718 and 0 files versus 719791)

 I don't see where it's having problems.  Can anyone help?

 Thanks,
 Jonathan





Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread Craig Barratt
Barb writes:

 I too am having problems with backups on Snow Leopard.  I am also
 using tar over ssh, and my error logs look pretty much the same as
 Jonathan's.  At the end it nicely says No files dumped for share /
 and shows the backup as failed.  It does, however, backup the new
 files I've added and I can restore them.

Ok - several people have reported this now.  I need to either
find a Snow Leopard machine or provide some debugging code for
someone to run on their setup.  Any volunteers if I take the
second path?

Craig



Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread Steven Karel
Could this have to do with the fact that OS X switched the default tar  
from gnutar to bsdtar in Snow Leopard?

http://discussions.apple.com/thread.jspa?threadID=2144311&tstart=0

I think gnutar is still there; you might just have to change the  
arguments used to call it.

Sorry, I don't have Snow Leopard installed yet, but will soon.
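A quick way to tell which flavor a client's tar is would be to parse its version banner, e.g. over ssh with `ssh root@client /usr/bin/tar --version`. A small sketch (the client path and the idea that BackupPC's tar method needs GNU tar semantics are assumptions worth verifying for your setup):

```shell
#!/bin/sh
# Sketch: report whether a given tar binary is GNU tar or bsdtar,
# based on the first line of its --version output.
tar_flavor() {
    case "$("$1" --version 2>/dev/null | head -n 1)" in
        *"GNU tar"*) echo gnu ;;
        *bsdtar*)    echo bsd ;;
        *)           echo unknown ;;
    esac
}
```

If it reports bsd, pointing the tar path in your BackupPC client config at gnutar instead (e.g. via $Conf{TarClientPath}, if your version has it) might be worth trying.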

On Sep 25, 2009, at 1:31 PM, Craig Barratt wrote:

 Barb writes:

 I too am having problems with backups on Snow Leopard.  I am also
 using tar over ssh, and my error logs look pretty much the same as
 Jonathan's.  At the end it nicely says No files dumped for share /
 and shows the backup as failed.  It does, however, backup the new
 files I've added and I can restore them.

 Ok - several people have reported this now.  I need to either
 find a Snow Leopard machine or provide some debugging code for
 someone to run on their setup.  Any volunteers if I take the
 second path?

 Craig





Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-25 Thread Fernando Laudares Camargos
Hi Les,

Les Mikesell a écrit :
 Fernando Laudares Camargos wrote:

 * Why BackupPC_tarPCCopy sometimes produces big tar files ? *
 
 Are you doing something to control the timing of your copies compared to 
 the changes backuppc would be making?  The files in cpool need to be 
 exactly in sync with the directory snapshots made by BackupPC_tarPCCopy.

I'm doing two things (although I'm not sure this answers your question directly):

1) rsync of the cpool without --delete (so the cpool will keep growing and no files 
will ever be deleted. I assume that's fine, apart from the fact that it will take 
more disk space).
 
 It sounds like you are hitting at a time with there are still files in 
 the 'new' directories that don't have pool links yet, but I'm not sure 
 how you can tell when everything is linked.

2) I'm doing 'snapshots' of backup sets with BackupPC_tarPCCopy that are 3-5 days 
old; I would expect that to leave enough time for the linking to have completed.

"I'm not sure how you can tell when everything is linked": exactly! Neither do 
I. But I have the feeling that sometimes a backup set will be skipped from being 
linked completely, or at least that may happen for new files that are not yet 
in the cpool. That would explain my big tar files with BackupPC_tarPCCopy, at 
least...

Thanks for your answer, I appreciate it.
-- 
Fernando Laudares Camargos

  Révolution Linux
http://www.revolutionlinux.com
---
* Tout opinion et prise de position exprimée dans ce message est celle
de son auteur et pas nécessairement celle de Révolution Linux.
** Any views and opinion presented in this e-mail are solely those of
the author and do not necessarily represent those of Révolution Linux.



Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-25 Thread Fernando Laudares Camargos
Hello Jeffrey,

Jeffrey J. Kosowsky a écrit :
 I have written a program BackupPC_fixLinks.pl that will go through
 both your pool and pc chain to find (and fix) duplicate pool files and
 missing links. Part of the fix is to run an equivalent to
 BackupPC_Link on missing/broken links.
 
 Note you can't just run BackupPC_Link (or its analog) unless you have
 a list of files to run it on, which is part of what my program does.

Yes, I had noted that and tried without success to recreate the list based on 
the XferLOG.backup_set.z file - it's great news that you have got it right.

 I can send it to you or repost it if you are interested.

I have found a version of it in:

http://www.backupcentral.com/phpBB2/two-way-mirrors-of-external-mailing-lists-3/backuppc-21/script-for-checking-fixing-missing-duplicated-broken-l-93932/

It dates from a year ago and says version 1.0. If this is the latest, I can 
take a copy from there. Otherwise, please send it to me. I have taken a brief 
look at it; it seems like you have put a lot of effort into it, and I'm glad 
I can benefit from your hard work.

Regards,
-- 
Fernando Laudares Camargos

  Révolution Linux
http://www.revolutionlinux.com
---
* Tout opinion et prise de position exprimée dans ce message est celle
de son auteur et pas nécessairement celle de Révolution Linux.
** Any views and opinion presented in this e-mail are solely those of
the author and do not necessarily represent those of Révolution Linux.

 

Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread Michael Stowe

I have Snow Leopard machines, and I don't mind trying out debugging code
-- though I use the rsync method and haven't had a moment's trouble.

 Barb writes:

 I too am having problems with backups on Snow Leopard.  I am also
 using tar over ssh, and my error logs look pretty much the same as
 Jonathan's.  At the end it nicely says No files dumped for share /
 and shows the backup as failed.  It does, however, backup the new
 files I've added and I can restore them.

 Ok - several people have reported this now.  I need to either
 find a Snow Leopard machine or provide some debugging code for
 someone to run on their setup.  Any volunteers if I take the
 second path?

 Craig




[BackupPC-users] OT: (e.g.) sed command to modify configuration file

2009-09-25 Thread Timothy J Massey
Hello!

I have a shell script that I use to install BackupPC.  It takes a standard 
CentOS installation and performs the configuration that I would normally 
do to install BackupPC.  There are probably way better ways of doing this, 
but this is the way I've chosen.

As part of this script, I use sed to modify certain configuration files. 
My sed-fu is weak, however, and I've only gotten it to do the most basic 
things:  insert static text immediately after a simple string match.  For 
example, something like this:

sed -i.org 's/^[ #]*PermitRootLogin *.*$/#\nPermitRootLogin no/' 
/etc/ssh/sshd_config

What I'm trying to do is search a configuration file for zero or more 
occurrences of a particular configuration element (either commented out or 
not), prepend a # to all of them (again, commented out or not), and append 
the proper configuration line.  The line above works under extremely 
narrow circumstances, but it's very fragile.  Does anyone have a good way 
to do this (sed or otherwise) from within a (bash) shell script?

That's my question.  If you already know the answer, then stop reading 
here and e-mail me the solution!  :)  Otherwise, here's an example of what 
I'm looking for:

Here is a sample configuration file simplified from sshd_config:

#Example of a greatly reduced sshd_config
#Protocol 2,1
Protocol 2
#Additional lines here
#PermitRootLogin yes
#Additional lines here


I want to alter this in two ways:
1) comment out all Protocol lines and add a line Protocol 2.  (Yes, I 
know it already says this.  Pretend that I want Protocol 1, if it helps.)
2) comment out all PermitRootLogin lines and add a PermitRootLogin no

In the end, I'd like to see this:

#Example of a greatly reduced sshd_config
##Protocol 2,1
#Protocol 2
Protocol 2
#Additional lines here
##PermitRootLogin yes
PermitRootLogin no
#Additional lines here

With the sed line I've outlined at the top, it will add a # to the 
beginning of *every* e.g. PermitRootLogin line and add the proper line 
right below that.  It only works right now because there's only one 
PermitRootLogin line.  But it falls down terribly if there are more than 
one, such as with the Protocol line.  AFAICT, there's no way to tell sed 
to either add text only at the last match (which I can understand, it's 
hard to know if it will be the last match until the end, and by then it's 
too late), or to stop editing after the first match and merely dump the 
rest of the file into the output.  Without being able to do either of 
these things, I'm stuck...

Ideas?

Tim Massey




Re: [BackupPC-users] Troubles with 2 Snow Leopard clients using tar over ssh

2009-09-25 Thread jingai
On Sep 25, 2009, at 1:31 PM, Craig Barratt wrote:

 Barb writes:

 I too am having problems with backups on Snow Leopard.  I am also
 using tar over ssh, and my error logs look pretty much the same as
 Jonathan's.  At the end it nicely says No files dumped for share /
 and shows the backup as failed.  It does, however, backup the new
 files I've added and I can restore them.

 Ok - several people have reported this now.  I need to either
 find a Snow Leopard machine or provide some debugging code for
 someone to run on their setup.  Any volunteers if I take the
 second path?

I'd be happy to help.

-j



Re: [BackupPC-users] OT: (e.g.) sed command to modify configuration file

2009-09-25 Thread Jim Wilcoxson
Well, that didn't work - Gmail broke the line.  There should be a
space after the #, so that any number of spaces and hashes will be
removed.

Jim

On 9/25/09, Jim Wilcoxson pri...@gmail.com wrote:
 Try this:

 [...@amd backup]$ cat x
 #Example of a greatly reduced sshd_config
 #Protocol 2,1
 Protocol 2
 #Additional lines here
 #PermitRootLogin yes
 #Additional lines here

 [...@amd backup]$ sed 's/[#
 ]*Protocol/Protocol/;s/Protocol/#Protocol/;$aProtocol jw' x
 #Example of a greatly reduced sshd_config
 #Protocol 2,1
 #Protocol 2
 #Additional lines here
 #PermitRootLogin yes
 #Additional lines here
 Protocol jw

 [...@amd backup]$

 The first s/ uncomments all Protocol lines, the next comments them,
 the last command appends your own Protocol command.

 Jim
 http://sites.google.com/site/hashbackup



 On 9/25/09, Timothy J Massey tmas...@obscorp.com wrote:
 Hello!

 I have a shell script that I use to install BackupPC.  It takes a
 standard
 CentOS installation and performs the configuration that I would normally
 do to install BackupPC.  There are probably way better ways of doing
 this,
 but this is the way I've chosen.

 As part of this script, I use sed to modify certain configuration files.
 My sed-fu is weak, however, and I've only gotten it to do the most basic
 things:  insert static text immediately after a simple string match.  For
 example, something like this:

 sed -i.org 's/^[ #]*PermitRootLogin *.*$/#\nPermitRootLogin no/'
 /etc/ssh/sshd_config

 What I'm trying to do is search a configuration file for zero or more
 occurrences of a particular configuration element (either commented out
 or
 not), prepend a # to all of them (again, commented out or not), and
 append
 the proper configuration line.  The line above works under extremely
 narrow circumstances, but it's very fragile.  Does anyone have a good way
 to do this (sed or otherwise) from within a (bash) shell script?

 That's my question.  If you already know the answer, then stop reading
 here and e-mail me the solution!  :)  Otherwise, here's an example of
 what
 I'm looking for:

 Here is a sample configuration file simplified from sshd_config:

 #Example of a greatly reduced sshd_config
 #Protocol 2,1
 Protocol 2
 #Additional lines here
 #PermitRootLogin yes
 #Additional lines here


 I want to alter this in two ways:
 1) comment out all Protocol lines and add a line Protocol 2.  (Yes, I
 know it already says this.  Pretend that I want Protocol 1, if it helps.)
 2) comment out all PermitRootLogin lines and add a PermitRootLogin no

 In the end, I'd like to see this:

 #Example of a greatly reduced sshd_config
 ##Protocol 2,1
 #Protocol 2
 Protocol 2
 #Additional lines here
 ##PermitRootLogin yes
 PermitRootLogin no
 #Additional lines here

 With the sed line I've outlined at the top, it will add a # to the
 beginning of *every* e.g. PermitRootLogin line and add the proper line
 right below that.  It only works right now because there's only one
 PermitRootLogin line.  But it falls down terribly if there are more than
 one, such as with the Protocol line.  AFAICT, there's no way to tell sed
 to either add text only at the last match (which I can understand, it's
 hard to know if it will be the last match until the end, and by then it's
 too late), or to stop editing after the first match and merely dump the
 rest of the file into the output.  Without being able to do either of
 these things, I'm stuck...

 Ideas?

 Tim Massey







Re: [BackupPC-users] OT: (e.g.) sed command to modify configuration file

2009-09-25 Thread Jim Wilcoxson
Try this:

[...@amd backup]$ cat x
#Example of a greatly reduced sshd_config
#Protocol 2,1
Protocol 2
#Additional lines here
#PermitRootLogin yes
#Additional lines here

[...@amd backup]$ sed 's/[# ]*Protocol/Protocol/;s/Protocol/#Protocol/;$aProtocol jw' x
#Example of a greatly reduced sshd_config
#Protocol 2,1
#Protocol 2
#Additional lines here
#PermitRootLogin yes
#Additional lines here
Protocol jw

[...@amd backup]$

The first s/// uncomments all the Protocol lines, the second comments them
all out, and the final $a appends your own Protocol line after the last line.
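Jim's three-step recipe extends to several directives at once. A self-contained sketch, assuming GNU sed (the sample file is Tim's reduced sshd_config; separate -e expressions keep the two appended lines distinct):

```shell
# Recreate the sample config from the thread.
cat > sshd_config.sample <<'EOF'
#Example of a greatly reduced sshd_config
#Protocol 2,1
Protocol 2
#Additional lines here
#PermitRootLogin yes
#Additional lines here
EOF

# 1) normalize: strip leading '#'s/spaces before either directive
# 2) comment out every normalized directive line
# 3) append the desired settings after the last line ($a)
sed -E \
    -e 's/^[# ]*(Protocol|PermitRootLogin)/\1/' \
    -e 's/^(Protocol|PermitRootLogin)/#\1/' \
    -e '$aProtocol 2' \
    -e '$aPermitRootLogin no' \
    sshd_config.sample
```

Note that already-commented lines come out with a single '#' rather than the '##' in Tim's desired output, because step 1 normalizes them first; functionally the result is the same.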

Jim
http://sites.google.com/site/hashbackup



On 9/25/09, Timothy J Massey tmas...@obscorp.com wrote:
 Hello!

 I have a shell script that I use to install BackupPC.  It takes a standard
 CentOS installation and performs the configuration that I would normally
 do to install BackupPC.  There are probably way better ways of doing this,
 but this is the way I've chosen.

 As part of this script, I use sed to modify certain configuration files.
 My sed-fu is weak, however, and I've only gotten it to do the most basic
 things:  insert static text immediately after a simple string match.  For
 example, something like this:

 sed -i.org 's/^[ #]*PermitRootLogin *.*$/#\nPermitRootLogin no/'
 /etc/ssh/sshd_config

 What I'm trying to do is search a configuration file for zero or more
 occurrences of a particular configuration element (either commented out or
 not), prepend a # to all of them (again, commented out or not), and append
 the proper configuration line.  The line above works under extremely
 narrow circumstances, but it's very fragile.  Does anyone have a good way
 to do this (sed or otherwise) from within a (bash) shell script?

 That's my question.  If you already know the answer, then stop reading
 here and e-mail me the solution!  :)  Otherwise, here's an example of what
 I'm looking for:

 Here is a sample configuration file simplified from sshd_config:

 #Example of a greatly reduced sshd_config
 #Protocol 2,1
 Protocol 2
 #Additional lines here
 #PermitRootLogin yes
 #Additional lines here


 I want to alter this in two ways:
 1) comment out all Protocol lines and add a line Protocol 2.  (Yes, I
 know it already says this.  Pretend that I want Protocol 1, if it helps.)
 2) comment out all PermitRootLogin lines and add a PermitRootLogin no

 In the end, I'd like to see this:

 #Example of a greatly reduced sshd_config
 ##Protocol 2,1
 #Protocol 2
 Protocol 2
 #Additional lines here
 ##PermitRootLogin yes
 PermitRootLogin no
 #Additional lines here

 With the sed line I've outlined at the top, it will add a # to the
 beginning of *every* e.g. PermitRootLogin line and add the proper line
 right below that.  It only works right now because there's only one
 PermitRootLogin line.  But it falls down terribly if there are more than
 one, such as with the Protocol line.  AFAICT, there's no way to tell sed
 to either add text only at the last match (which I can understand, it's
 hard to know if it will be the last match until the end, and by then it's
 too late), or to stop editing after the first match and merely dump the
 rest of the file into the output.  Without being able to do either of
 these things, I'm stuck...

 Ideas?

 Tim Massey






[BackupPC-users] Switching backup methods

2009-09-25 Thread jingai
Is it OK to switch between backup methods at any time, or do I need to  
do something beforehand?

I'm trying rsync instead of tar again on one of my Snow Leopard  
clients to see if it works better.  Quite some time ago, I found it to  
be considerably slower, but that may have changed.
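
For what it's worth, the switch itself is normally just a per-host override of $Conf{XferMethod}; a minimal sketch (the file name is a placeholder for your host's per-PC config):

```perl
# Hypothetical per-host config file, e.g. pc/snowleopard.pl:
$Conf{XferMethod} = 'rsync';   # was 'tar'
```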

-Jonathan



Re: [BackupPC-users] OT: (e.g.) sed command to modify configuration file

2009-09-25 Thread Davide Brini
On Friday 25 September 2009, Timothy J Massey wrote:
 Hello!

 I have a shell script that I use to install BackupPC.  It takes a standard
 CentOS installation and performs the configuration that I would normally
 do to install BackupPC.  There are probably way better ways of doing this,
 but this is the way I've chosen.

 As part of this script, I use sed to modify certain configuration files.
 My sed-fu is weak, however, and I've only gotten it to do the most basic
 things:  insert static text immediately after a simple string match.  For
 example, something like this:

 sed -i.org 's/^[ #]*PermitRootLogin *.*$/#\nPermitRootLogin no/'
 /etc/ssh/sshd_config

 What I'm trying to do is search a configuration file for zero or more
 occurrences of a particular configuration element (either commented out or
 not), prepend a # to all of them (again, commented out or not), and append
 the proper configuration line.  The line above works under extremely
 narrow circumstances, but it's very fragile.  Does anyone have a good way
 to do this (sed or otherwise) from within a (bash) shell script?

 That's my question.  If you already know the answer, then stop reading
 here and e-mail me the solution!  :)  Otherwise, here's an example of what
 I'm looking for:

 Here is a sample configuration file simplified from sshd_config:

 #Example of a greatly reduced sshd_config
 #Protocol 2,1
 Protocol 2
 #Additional lines here
 #PermitRootLogin yes
 #Additional lines here


 I want to alter this in two ways:
 1) comment out all Protocol lines and add a line Protocol 2.  (Yes, I
 know it already says this.  Pretend that I want Protocol 1, if it helps.)
 2) comment out all PermitRootLogin lines and add a PermitRootLogin no

 In the end, I'd like to see this:

 #Example of a greatly reduced sshd_config
 ##Protocol 2,1
 #Protocol 2
 Protocol 2
 #Additional lines here
 ##PermitRootLogin yes
 PermitRootLogin no
 #Additional lines here

 With the sed line I've outlined at the top, it will add a # to the
 beginning of *every* e.g. PermitRootLogin line and add the proper line
 right below that.  It only works right now because there's only one
 PermitRootLogin line.  But it falls down terribly if there are more than
 one, such as with the Protocol line.  AFAICT, there's no way to tell sed
 to either add text only at the last match (which I can understand, it's
 hard to know if it will be the last match until the end, and by then it's
 too late), or to stop editing after the first match and merely dump the
 rest of the file into the output.  Without being able to do either of
 these things, I'm stuck...

What you can do is blindly comment out all the lines you don't want and, when 
you reach the last line, append your own. Something like this:

sed 's/^Protocol/#&/;$s/$/\nProtocol 2/'

(note that the above assumes a sed that recognizes \n in the replacement 
text, like GNU sed)

This comments out all lines starting with Protocol, and appends Protocol 2 
after the very last line.

It can be extended to comment out different types of lines, for example 
suppose you want to add three lines:

Protocol 2
PermitRootLogin no
PrintLastLog yes

So you can do something like this (put this in a file, and run it with sed -f 
program.sed):

s/^Protocol/#&/
s/^PermitRootLogin/#&/
s/^PrintLastLog/#&/
$s/$/\nProtocol 2\nPermitRootLogin no\nPrintLastLog yes/

If you're using GNU sed (likely, since you mention CentOS), you can exploit 
its extended regex capabilities and do it more compactly:

# invoke with sed -r
s/^(Protocol|PermitRootLogin|PrintLastLog)/#&/
$s/$/\nProtocol 2\nPermitRootLogin no\nPrintLastLog yes/

Keep in mind, though, that at some point (for example, if you need some other 
complex modification or customization) the problem becomes too complicated to 
solve easily with sed alone.
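
Past that point, a small awk script often reads better than chained substitutions: one rule comments out every matching line (commented or not), and the END block appends the wanted values once. A sketch using the thread's two directives (file names are illustrative):

```shell
cat > fix_sshd.awk <<'EOF'
# Prefix a '#' to any line whose first word (after optional '#'s and
# spaces) is one of the directives; append the wanted settings at EOF.
/^#* *(Protocol|PermitRootLogin) / { print "#" $0; next }
                                   { print }
END { print "Protocol 2"; print "PermitRootLogin no" }
EOF

# Try it on a three-line sample:
printf '%s\n' '#Protocol 2,1' 'Protocol 2' '#PermitRootLogin yes' \
    > sshd.sample
awk -f fix_sshd.awk sshd.sample
```

Already-commented lines gain a second '#' (##Protocol 2,1), matching the output Tim asked for. Write the result to a temporary file and mv it into place rather than redirecting onto the input.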

Hope that helps.

-- 
D.



Re: [BackupPC-users] Backing up a BackupPC host - *using rsync+tarPCCopy*

2009-09-25 Thread Les Mikesell
Fernando Laudares Camargos wrote:

 Fernando Laudares Camargos wrote:
 * Why BackupPC_tarPCCopy sometimes produces big tar files ? *
 Are you doing something to control the timing of your copies compared to 
 the changes backuppc would be making?  The files in cpool need to be 
 exactly in sync with the directory snapshots made by BackupPC_tarPCCopy.
 
 I'm doing two things (although I'm not sure this answers your question 
 correctly):
 
 1) rsync of cpool without --delete (so, cpool will keep growing, no files 
 will ever be deleted. I assume that's fine apart from the fact it will take 
 more disk space).

BackupPC_nightly may rename chains of hash collisions in cpool as part of its 
cleanup.  If such a rename occurs between the rsync runs and the 
BackupPC_tarPCCopy or restore, you'll end up with links to the wrong files.
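
One way to sidestep that race is to keep BackupPC quiescent for the whole copy, so nothing can rename cpool files between the rsync and the tarPCCopy. A pseudocode outline (service name, paths, and invocation details are assumptions; check your own install before relying on this):

```text
stop the backuppc service                   # BackupPC_nightly can no longer run
rsync -aH <topdir>/cpool/ <destination>/cpool/
BackupPC_tarPCCopy <topdir>/pc > <destination>/pc.tar
restart the backuppc service
```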


-- 
   Les Mikesell
lesmikes...@gmail.com

