[BackupPC-users] Unable to call Host-Summary in current beta

2009-10-08 Thread Stefan Jurisch

Hello everyone,

A few days ago I installed the current beta0 onto a fresh OpenSUSE 11.1
(unmodified standard installation) and started some backup tasks.
It did its work completely and without problems, but now when I try to
look at the host summary, nothing shows up except an HTTP 500 Internal
Server Error!

The question is why, because it worked without errors for several days
and I did not modify anything.

Can someone give any advice?

Thanks in advance.

Regards,
Stefan



--

S T E F A N   J U R I S C H
------------------------------------------
System Engineer - VMware Support

SIEGNETZ.IT GmbH
Schneppenkauten 1a
DE-57076 Siegen

Tel. +49 271 68193-0
Fax: +49 271 68193-28

http://www.siegnetz.de


Amtsgericht Siegen HRB4838
Managing Director: Oliver Seitz
Registered office: Siegen

------------------------------------------

The word WINDOWS comes from an
old Sioux dialect and means:
"White man stares through a pane
of glass at an hourglass."

------------------------------------------





--
Come build with us! The BlackBerry(R) Developer Conference in SF, CA
is the only developer event you need to attend this year. Jumpstart your
developing skills, take BlackBerry mobile applications to market and stay 
ahead of the curve. Join us from November 9 - 12, 2009. Register now!
http://p.sf.net/sfu/devconference
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


[BackupPC-users] Pooling doesn't work

2009-10-08 Thread Patric Hafner
Hello everybody,

I'm running BackupPC 3.1.0 on Debian Lenny. BackupPC is responsible
for about 5 clients, which are backed up over rsync/ssh.

My problem is that during an incremental backup nearly every file is
marked as "create", so nearly every file is downloaded again. Only
about 20% are marked as "pool". But the files marked as "create" have
not changed since the last run; the timestamps are still the same. For
example, the whole /etc directory is downloaded every day, and I can
say for certain that nothing changed there.

This results in an extensive amount of traffic, which is unacceptable.
For example: my old, hand-written rsync-based solution found around
50 MB of changed files, but BackupPC transferred 3000 MB.

An example from the File Size/Count Reuse Summary:

                       Totals             Existing Files      New Files
Backup#  Type   #Files  Size/MB  MB/sec   #Files  Size/MB   #Files  Size/MB
35       incr      996   3560.5    1.69      710    178.9      402   3381.7

Does anyone have an idea? That would be great.

Many thanks for your help.

Best regards,

Patric



Re: [BackupPC-users] Pooling doesn't work

2009-10-08 Thread Holger Parplies
Hi,

Patric Hafner wrote on 2009-10-08 16:29:54 +0200 [[BackupPC-users] Pooling 
doesn't work]:
 i'm running BackupPC 3.1.0 with Debian Lenny. BackupPC is responsible
 for about 5 Clients which are backupped over rsync/ssh.
 
 My problem is, that during an incremental backup nearly every file is
 marked as create, so nearly every file will be downloaded again.
 About 20% are marked as pool.
 But those files marked as create haven't changed since the last run,
 timestamps are still the same. For example the whole /etc directory will
 be downloaded every day. And I can surely say that nothing changed there.
 [...]
 Does anyone has an idea? This would be great.

yes, you are probably using incremental backups incorrectly, but since you
don't tell us anything about your configuration, we can only guess.

Level 1 incremental backups download everything that has changed since the
last full backup. Presuming your last full was long ago, or you have modified
your configuration since then (e.g. changed from a test backup of, say, /lib,
to a full backup of all of your root file system), you will be downloading
everything changed or added since the last full backup with every incremental.

Run a full backup and see if the following incrementals behave better. If so,
send us some details about your configuration (esp. full and incremental
backup scheduling settings) to let us help you adjust your schedule. In short:
you *need* regular full backups.
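The level-1 behavior described above can be sketched with a toy example; every name and path below is invented, and this does not touch BackupPC itself (GNU touch/find assumed):

```shell
# Everything newer than the last FULL backup is transferred again by
# EVERY level-1 incremental, no matter how many incrementals have run
# since. Simulate "changed since the full" with file timestamps:
mkdir -p /tmp/incrdemo
touch -d '10 days ago' /tmp/incrdemo/last_full_marker   # pretend the last full ran then
touch -d '5 days ago'  /tmp/incrdemo/changed_day5       # changed after the full
touch -d '1 day ago'   /tmp/incrdemo/changed_day1       # changed yesterday
# Both files are newer than the full, so BOTH go into today's incremental:
find /tmp/incrdemo -type f -newer /tmp/incrdemo/last_full_marker | sort
```

So the further back the last full lies, the fatter every incremental gets, which is why regular fulls matter.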

Regards,
Holger



Re: [BackupPC-users] Pooling doesn't work

2009-10-08 Thread Holger Parplies
Hi again,

I was just going to add that your subject is incorrect, but I see that you
seem to be having a second issue. Sorry for replying a bit hastily, but your
wording does make it rather easy to draw incorrect conclusions (or rather miss
essential points).

Patric Hafner wrote on 2009-10-08 16:29:54 +0200 [[BackupPC-users] Pooling 
doesn't work]:
 [...]
 My problem is, that during an incremental backup nearly every file is
 marked as create, so nearly every file will be downloaded again.
 About 20% are marked as pool.

Note that "pool" also means "downloaded again". "Not downloaded due to rsync
savings" is "same" (in a full backup), or the file simply not appearing in the
log (in an incremental backup).

 But those files marked as create haven't changed since the last run,
 timestamps are still the same. For example the whole /etc directory will
 be downloaded every day. And I can surely say that nothing changed there.

Timestamps are not the only indication of change. It is *possible* to modify a
file without changing the timestamp (e.g. resetting it after the change). But
that is probably not what is happening here.

It would appear that pooling is only *partially* working (which is confusing
in itself). You couldn't have files marked "pool" if there was no pooling at
all. I would *guess* that you have probably changed $TopDir incorrectly after
having made some backups. You probably have tons of "link failed ..." errors
in your log files. New files are not added to the pool, so only files already
present from your first backups would be found there, though linking would not
work for them, either.
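Pooling is hard-link based: each stored file is one inode, referenced from the pool and from every backup that contains it. A minimal sketch with made-up names (not BackupPC's real on-disk layout):

```shell
# One pool entry, hard-linked into two "backups": the content is
# stored once, and the inode's link count shows every reference.
mkdir -p /tmp/pooldemo/pool /tmp/pooldemo/pc
printf 'identical file contents\n' > /tmp/pooldemo/pool/c0ffee    # pool copy, named by a fake hash
ln /tmp/pooldemo/pool/c0ffee /tmp/pooldemo/pc/monday_etc_hosts    # link into "Monday's backup"
ln /tmp/pooldemo/pool/c0ffee /tmp/pooldemo/pc/tuesday_etc_hosts   # link into "Tuesday's backup"
stat -c %h /tmp/pooldemo/pool/c0ffee                              # link count: pool + 2 backups = 3
```

When "link failed" errors appear, it is this ln step that is failing, typically because the pool and pc/ trees ended up on different filesystems (hard links cannot cross filesystems).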

Again, for anything more than educated guesses about what might be going
wrong, we need details about your setup.

- What version, what installation method, what OS, which paths, which
  filesystem(s), how partitioned?
- What did you change recently? Move $TopDir? How? See
  
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory
  for details on what you should or should not have done.
- Is there anything suspicious in your log files ($LogDir/LOG and
  $TopDir/pc/hostname/XferLOG.NN.z)?
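To show what to look for in those logs, here is a grep over a fabricated plain-text excerpt (real XferLOG.NN.z files are compressed; BackupPC ships a BackupPC_zcat helper to read them, its path depending on your install):

```shell
# Fabricated log excerpt, only to illustrate the pattern to grep for:
mkdir -p /tmp/logdemo
cat > /tmp/logdemo/XferLOG.sample <<'EOF'
  pool 644 0/0 1024 etc/hosts
  create 644 0/0 2048 etc/passwd
Unable to link etc/passwd to pool: link failed
EOF
grep -c 'link failed' /tmp/logdemo/XferLOG.sample
```

A non-zero count on a real XferLOG would point at the pooling problem described above.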

 This results an an extensive amount of traffic, which is unacceptable.

Err, no. It results in an excessive amount of storage being used. Traffic is
independent of storage. If pooling were working correctly, you could still have
the same amount of traffic, but everything should be marked "pool" and stored
only once. Conversely, if rsync transfers were working correctly, you would
save traffic, but that does not imply that pooling would work. True, for this
one host unchanged files would be re-used, but they would not be matched up
against independent copies of identical content (from the same host or from
different hosts).

You need to fix both issues, and they are independent of each other.

Regards,
Holger



Re: [BackupPC-users] Pooling doesn't work

2009-10-08 Thread Patric Hafner
Hi,

thanks for replying and explaining. This made some things clearer for me.

Holger Parplies schrieb:
 Hi again,
 
 I was just going to add that your subject is incorrect, but I see that you
 seem to be having a second issue. Sorry for replying a bit hastily, but your
 wording does make it rather easy to draw incorrect conclusions (or rather miss
 essential points).
I'm sorry. My English isn't as good as it should be.

 
 Patric Hafner wrote on 2009-10-08 16:29:54 +0200 [[BackupPC-users] Pooling 
 doesn't work]:
 [...]
 My problem is, that during an incremental backup nearly every file is
 marked as create, so nearly every file will be downloaded again.
 About 20% are marked as pool.
 
 Note that "pool" also means "downloaded again". "Not downloaded due to rsync
 savings" is "same" (in a full backup), or the file simply not appearing in the
 log (in an incremental backup).
Okay, I understand.

 
 But those files marked as create haven't changed since the last run,
 timestamps are still the same. For example the whole /etc directory will
 be downloaded every day. And I can surely say that nothing changed there.
 
 Timestamps are not the only indication of change. It is *possible* to modify a
 file without changing the timestamp (e.g. resetting it after the change). But
 that is probably not what is happening here.

Okay, but why does rsync assume, one day after performing a full backup,
that files changed which definitely have not changed?
I did the following to rule out changes: I set up a new Linux client and
created some directories in /home/. After that I started a full backup.
One day later I performed an incremental backup. The log says that the
whole /etc and /home directories are set to "create". Why this behavior?

I guess pooling works, and my main problem is: too many files are
inexplicably marked as "create".


 
 It would appear that pooling is only *partially* working (which is confusing
 in itself). You couldn't have files marked pool if there was no pooling at
 all. I would *guess* that you have probably incorrectly changed $TopDir after
 having made some backups. You probably have tons of link failed ... errors
 in your log files. New files are not added to the pool, so only files already
 present from your first backups would be found there, though linking would not
 work for them, either.
 
 Again, for anything more than educated guesses about what might be going
 wrong, we need details about your setup.
 
 - What version, what installation method, what OS, which paths, which
   filesystem(s), how partitioned?

BackupPC 3.1.0 that comes with Debian Lenny. One root partition (ext3,
/, normal SATA drive), one data partition (ext3, /data, external RAID 5
array).

 - What did you change recently? Move $TopDir? How? See
   
 http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory
   for details on what you should or should not have done.

I created a soft-link during installation: /var/lib/backuppc ->
/data/backuppc/. No backup was made before creating this soft-link.

 - Is there anything suspicious in your log files ($LogDir/LOG and
   $TopDir/pc/hostname/XferLOG.NN.z)?

Everything looks fine. No "link failed" errors.

 
 This results an an extensive amount of traffic, which is unacceptable.
 
 Err, no. It results in an excessive amount of storage being used. Traffic is
 independant of storage. If pooling was working correctly, you could still have
 the same amount of traffic, but everything should be marked pool and stored
 only once. Conversely, if rsync transfers were working correctly, you would
 save traffic, but that does not imply that pooling would work. True, for this
 one host unchanged files would be re-used, but they would not be matched up
 against independent copies of identical content (from the same host or
 different hosts).

Thanks for explaining.

 i'm running BackupPC 3.1.0 with Debian Lenny. BackupPC is responsible
 for about 5 Clients which are backupped over rsync/ssh.

 My problem is, that during an incremental backup nearly every file is
 marked as create, so nearly every file will be downloaded again.
 About 20% are marked as pool.
 But those files marked as create haven't changed since the last run,
 timestamps are still the same. For example the whole /etc directory will
 be downloaded every day. And I can surely say that nothing changed there.
 [...]
 Does anyone has an idea? This would be great.
 
 yes, you are probably incorrectly using incremental backups, but since you
 don't say anything about your configuration, we can only guess.

For configuration details, see above.


 Level 1 incremental backups download everything that has changed since the
 last full backup. Presuming your last full was long ago, or you have modified
 your configuration since then (e.g. changed from a test backup of, say, /lib,
 to a full backup of all of your root file system), you will be downloading
 everything changed or added since the last full backup with every 

[BackupPC-users] Help with existing BackupPC installation

2009-10-08 Thread Terence Berendt
I need some help, and I hope this email is not considered spamming and that I'm
on the correct list.
I have inherited a BackupPC installation from a previous administrator who is
no longer with the company.  Unfortunately, I am not at all familiar with the
software, or even with Linux (the host OS) for that matter.  Everything was
going well, and I found the interface easy to use, until recently the hard
drive filled up and we can no longer back up.
The higher-ups in the company want to move the old data to another location,
such as a USB drive, and then adjust the backup so that it only retains data
for a more reasonable amount of time.  I looked at the documentation and cannot
figure out how to set up an archive properly (assuming that's what I really
want to do).
I hope someone can take pity on this newbie and give me some tips/help.

Terence Berendt
Link High Technologies
321 Palmer Road, Denville, NJ 07834
(973) 659-1350 x 112
www.LinkHigh.com

Link High Technologies: providing managed services, IT solutions, and computer
network services for companies, municipalities, and schools of all sizes in NY,
NJ & PA since 1992. Learn more at www.linkhigh.com

Technology Partners: Microsoft | Barracuda | ArcMail | Xilocore | Sonicwall |
Allworx |




Re: [BackupPC-users] Help with existing BackupPC installation

2009-10-08 Thread Carl Wilhelm Soderstrom
On 10/08 02:21 , Terence Berendt wrote:
 I need some help and hope this email is not considered spamming and that I'm 
 on the correct list.
 I have inherited a BackupPC installation from a previous administrator who is 
 no longer with the company.  Unfortunately, I am not at all familiar with the 
 software or even Linux (the host) for that matter.  Everything was going well 
 and I found the interface easy to use until recently when the hard drive 
 filled up and we can no longer back up.
 The higher ups in the company want to move the old data to another location, 
 such as a USB drive and then adjust the backup so that it only retains data 
 for a more reasonable amount of time.  I looked at the documentation and 
 cannot figure out how to setup an archive properly (assuming that's what I 
 really want to do).
 I hope someone can take pity on this newbie and give me some tips/help.

Is this BackupPC v3.0 or higher? I.e., do you have a web interface for
editing the configuration files, or do you have to do that by hand?

You may want to try limiting the amount of stuff you back up, in order to
make space.

Because of BackupPC's file-deduplicating feature ('pooling'), removing older
backups doesn't save as much space as you might think, except on very
busy hosts (i.e. ones where much of the data changes a lot).

If the install is from a Debian or Ubuntu package, you'll find the data
under /var/lib/backuppc/pc/hostname. If you try deleting any of the
subdirectories under there, you'll find that you have to wait a night (or
possibly more, depending on settings) for the space to be freed, unless you
run the BackupPC_nightly job by hand. There should be instructions somewhere
in the archives of this mailing list on how to do this properly.
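Why the space only comes back later can be sketched with hard links (made-up paths; the real cleanup is done by BackupPC_nightly, normally started on schedule by the daemon):

```shell
# Deleting a backup's file only removes ONE link; the pool copy keeps
# the data alive until the nightly job prunes pool files whose link
# count has dropped to 1 (i.e. nothing references them any more).
mkdir -p /tmp/nightdemo
echo 'some backed-up data' > /tmp/nightdemo/pool_file
ln /tmp/nightdemo/pool_file /tmp/nightdemo/backup_copy   # the backup's reference
rm /tmp/nightdemo/backup_copy                            # "delete the backup"
stat -c %h /tmp/nightdemo/pool_file                      # back to 1 link: nightly may now prune it
```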

Copying data to an external drive is an awkward procedure, and far from
polished yet. Consult the archives (use Google to search, not SourceForge's
search tool) for some discussion of this. If you're still stymied, ask again
and perhaps someone can give more guidance.

If you want to understand BackupPC, just read /etc/backuppc/config.pl end to
end. The comments in there should explain 90% of it.

-- 
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com



Re: [BackupPC-users] Help with existing BackupPC installation

2009-10-08 Thread Terence Berendt
Thanks for your help.
Yes, it is 3.1 with a web interface.  Otherwise with my lack of Linux 
knowledge, I wouldn't have gotten this far.
It seems that, once we copy the old items off the server, we can just change
the retention policy and they will be automatically removed, if I am
understanding things correctly.
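That matches how retention is usually adjusted. As a hedged sketch (the parameter names are from BackupPC 3.x's config.pl; the values here are invented, and in 3.1 you can set them through the web UI instead):

```shell
# Sample retention settings, written to a scratch file only so the
# snippet is self-contained; in real life these live in config.pl or
# a per-host override:
cat > /tmp/retdemo.pl <<'EOF'
$Conf{FullKeepCnt} = 2;    # keep the 2 most recent full backups
$Conf{IncrKeepCnt} = 6;    # keep the 6 most recent incrementals
$Conf{FullAgeMax}  = 60;   # also expire fulls older than 60 days
EOF
grep -c '^\$Conf' /tmp/retdemo.pl
```

Expired backups are then removed by the nightly cleanup, not instantly.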




Re: [BackupPC-users] 34851 files processed yet 0 files dumped for share

2009-10-08 Thread Lorrin Nelson
My issue was resolved by switching from bsdtar to gnutar, as mentioned
in the "Troubles with 2 Snow Leopard clients using tar over ssh" thread.
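For anyone hitting the same thing: the switch is a per-host config change. A hedged sketch (the parameter name $Conf{TarClientPath} is from BackupPC 3.x; the gnutar path is what Snow Leopard shipped, but verify it on your client):

```shell
# Written to a scratch file so the snippet is self-contained; in real
# life this line goes into the per-host config, e.g. aubergine.pl:
cat > /tmp/hostdemo.pl <<'EOF'
$Conf{TarClientPath} = '/usr/bin/gnutar';   # use GNU tar instead of bsdtar
EOF
grep -c gnutar /tmp/hostdemo.pl
```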

Cheers
-Lorrin

On Sep 24, 2009, at 10:31 PM, Lorrin Nelson wrote:

 On Sep 22, 2009, at 10:25 PM, Craig Barratt wrote:

 Lorrin writes:

 tarExtract: Done: 1 errors, 30960 filesExist, 15823490040 sizeExist,
 15823490040 sizeExistComp, 34851 filesTotal, 17143873836 sizeTotal
 Got fatal error during xfer (No files dumped for share /)

 TarShareName is set to /

 Do you know what the 1 error is?  Check the per-PC log file for
 warnings or errors too.

 Can you double-check whether TarShareName is actually set to
 ['/', '/'] (i.e. '/' twice)?

 Craig


 Hi Craig --

 Thanks for taking the time to respond.

 I didn't see anything noteworthy in the per-PC log file.

 TarShareName is inherited from config.pl:
   $Conf{TarShareName} = '/';

 It was a little tricky to find the one error in over 100k lines of
 screen dump. :-) Here it is, along with a few lines of context.

 tarExtract: Got file './Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 1 - Day 1.mpg', mode 0777, size 1.2828e+09, type 0
   pool 777 501/20 1282796852 Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 1 - Day 1.mpg
 tarExtract: Got file './Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 2 - Day 1.mpg', mode 0777, size 5.18103e+08, type 0
 tarExtract: Unable to open /var/lib/backuppc/pc/aubergine/new/f%2f/fUsers/ftina/fDocuments/f2009/fForum Theatre Project/fVideo files/fTape 2 - Day 1.mpg for writing after link fail
   create 777 501/20 518103244 Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 2 - Day 1.mpg
 tarExtract: Got file './Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 3 - Day 2.mpg', mode 0777, size 1.28608e+09, type 0
   pool 777 501/20 1286076016 Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 3 - Day 2.mpg
 tarExtract: Got file './Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 4 - Day 3.mpg', mode 0777, size 1.79964e+09, type 0
   pool 777 501/20 1799642832 Users/tina/Documents/2009/Forum Theatre Project/Video files/Tape 4 - Day 3.mpg

 The file in question seems to exist on disk (here under 163 instead of new):
 ls -alFi '/var/lib/backuppc/pc/aubergine/163/f%2f/fUsers/ftina/fDocuments/f2009/fForum Theatre Project/fVideo files/fTape 2 - Day 1.mpg'
 48488926 -rw-r- 2 backuppc backuppc 518103244 2009-09-24 06:57 /var/lib/backuppc/pc/aubergine/163/f%2f/fUsers/ftina/fDocuments/f2009/fForum Theatre Project/fVideo files/fTape 2 - Day 1.mpg

 I did find an entry in the pool with that inode:
 find /var/lib/backuppc/pool/ -inum 48488926
 /var/lib/backuppc/pool/5/c/f/5cf19104c918549fd2052c8db99ad836

 There's a good likelihood the backup problems started when these  
 files were placed on the client.

 -Lorrin

