Re: [BackupPC-users] more efficient: dump archives over the internet or copy the whole pool?

2010-11-02 Thread Craig Barratt
Frank, > Anyway, I thought I had it all figured out, but when I decrypt, gunzip, and > untar the resulting file, I get some "tar: Skipping to next header" messages > in the output, and, although I do get some files out of the archive, > eventually tar just hangs. Does the original tar archive
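A symptom like this usually means the stream was truncated or corrupted somewhere in the pipeline. A round-trip check along these lines can tell whether the stored file itself is damaged; this is a sketch assuming bash, GnuPG, and GNU tar are available, and the script and paths are illustrative, not from the thread:

    #!/usr/bin/perl
    # Sketch: decrypt + gunzip + list an archive, failing loudly if any
    # stage in the pipe fails (pipefail), so a truncated stream cannot
    # masquerade as success. Illustrative only.
    use strict;
    use warnings;

    my $archive = shift @ARGV or die "usage: $0 file.tar.gz.gpg\n";

    my $cmd = "set -o pipefail; "
            . "gpg --batch --decrypt '$archive' "
            . "| gunzip -c "
            . "| tar -tf - > /dev/null";

    if (system('/bin/bash', '-c', $cmd) == 0) {
        print "$archive: archive lists cleanly\n";
    } else {
        # If the file checks out on the machine that wrote it but fails
        # after the s3fs round trip, suspect the transport, not the tar.
        print "$archive: corrupt or truncated (exit ", $? >> 8, ")\n";
    }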

Re: [BackupPC-users] rsync --max-size parameter not honored in BackupPC

2010-11-02 Thread Craig Barratt
Andreas writes: > At our site files larger than 10GB are usually recreated faster than > restored from backup, therefore we added to the "RsyncExtraArgs" the parameter > "--max-size=100". Unfortunately rsync implements that on the receiving side (i.e., the server), and File::RsyncP doesn't impl
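For reference, this is the kind of setting being described; the option name is as quoted in the thread, and the comment reflects Craig's point (a sketch, not a verified config):

    # Excerpt in the style of a BackupPC config.pl (illustrative).
    # --max-size is enforced by the *receiving* rsync. During a backup,
    # BackupPC's receiver is File::RsyncP, which does not implement the
    # option, so it is accepted but has no effect.
    $Conf{RsyncExtraArgs} = [
        '--max-size=100',   # value as quoted in the thread; silently ignored
    ];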

[BackupPC-users] rsync --max-size parameter not honored in BackupPC

2010-11-02 Thread gregwm
I'd just exclude them by name/pattern until a better answer surfaces. > At our site files larger than 10GB are usually recreated faster than > restored from backup, therefore we added to the "RsyncExtraArgs" the > parameter > "--max-size=100". > > Although this parameter is visible in the r
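A sketch of that workaround in config.pl terms; the patterns are invented for illustration. With the rsync transfer method, excludes are applied on the sending side, so unlike --max-size they should actually take effect:

    # Illustrative exclude-by-pattern workaround for files too large to
    # be worth backing up. Patterns are examples, not from the thread.
    $Conf{BackupFilesExclude} = {
        '*' => [              # applies to every share
            '*.iso',          # e.g. installer/DVD images
            '*.vmdk',         # e.g. VM disks that are faster to recreate
        ],
    };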

Re: [BackupPC-users] more efficient: dump archives over the internet or copy the whole pool?

2010-11-02 Thread Les Mikesell
On 11/2/2010 4:22 PM, Frank J. Gómez wrote: > Thanks for your response, Les. > > Regarding the hardlinks, I was thinking (perhaps incorrectly) that since > I'd be putting an encrypted tar.gz on S3 (rather than all the individual > files) the hardlinking wouldn't be an issue and that the > non-

Re: [BackupPC-users] more efficient: dump archives over the internet or copy the whole pool?

2010-11-02 Thread Frank J. Gómez
Thanks for your response, Les. Regarding the hardlinks, I was thinking (perhaps incorrectly) that since I'd be putting an encrypted tar.gz on S3 (rather than all the individual files) the hardlinking wouldn't be an issue and that the non-redundancy would be preserved in the tar. I don't see

Re: [BackupPC-users] Fully disable Incremental

2010-11-02 Thread Timothy Omer
On 29 October 2010 13:35, Les Mikesell wrote: > > On 10/29/10 6:06 AM, Timothy Omer wrote: > > Hi all, > > > > I am only running full backups for my clients to make sure we are > > transferring the minimal amount of data possible. > > > > My Schedule... > > FullPeriod - 6.67 > > FullKeepCnt - 12 >
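The schedule above, plus one common way of keeping incrementals from ever coming due, might look like this in config.pl. Only FullPeriod and FullKeepCnt come from the message; the other two values are assumptions:

    # FullPeriod/FullKeepCnt as posted; the IncrPeriod trick is one
    # common way to run fulls only: make incrementals come due so
    # rarely that a full always runs first.
    $Conf{FullPeriod}  = 6.67;   # a full roughly every 7 days
    $Conf{FullKeepCnt} = 12;     # keep 12 fulls
    $Conf{IncrPeriod}  = 365;    # incrementals effectively never due
    $Conf{IncrKeepCnt} = 1;      # assumed; not from the thread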

Re: [BackupPC-users] more efficient: dump archives over the internet or copy the whole pool?

2010-11-02 Thread Les Mikesell
On 11/2/2010 2:42 PM, Frank J. Gómez wrote: > A little background: > I've been hacking on a copy of BackupPC_archiveHost to run archives > through GPG before saving them to disk. The reason for this is that, > when I say "saving to disk," I mean saving to an Amazon s3 share mounte

Re: [BackupPC-users] How does BackupPC work?

2010-11-02 Thread Tyler J. Wagner
On Tue, 2010-11-02 at 13:15 -0500, Carl Wilhelm Soderstrom wrote: > On 10/29 08:16 , Richard Shaw wrote: > > There is surprisingly little info on how BackupPC really works, at > > least with the google searches I've tried. I'm just looking for a > > concise overview of how the different backup meth

[BackupPC-users] more efficient: dump archives over the internet or copy the whole pool?

2010-11-02 Thread Frank J. Gómez
A little background: I've been hacking on a copy of BackupPC_archiveHost to run archives through GPG before saving them to disk. The reason for this is that, when I say "saving to disk," I mean saving to an Amazon s3 share mounted locally via s3fs
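A minimal sketch of that kind of change: inserting a gpg stage between the compressor and the output file in the pipeline BackupPC_archiveHost builds. All names, paths, and the key below are placeholders, not the script's actual internals:

    #!/usr/bin/perl
    # Illustrative fragment in the spirit of a hacked BackupPC_archiveHost.
    use strict;
    use warnings;

    # Placeholders standing in for what the real script gets as arguments.
    my $tarCreatePath = '/usr/share/backuppc/bin/BackupPC_tarCreate';
    my $compPath      = '/bin/gzip';
    my $host          = 'myhost';
    my $bkupNum       = -1;                    # -1 = most recent backup
    my $outLoc        = '/mnt/s3';             # s3fs mount point
    my $gpgRecipient  = 'backup@example.com';  # hypothetical key

    my $outFile = "$outLoc/$host.$bkupNum.tar.gz.gpg";

    # Original shape:  tarCreate ... | compress          > file
    # Modified shape:  tarCreate ... | compress | gpg -e > file
    my $cmd = "set -o pipefail; "
            . "$tarCreatePath -h $host -n $bkupNum -s '*' . "
            . "| $compPath -c "
            . "| gpg --batch --encrypt -r $gpgRecipient "
            . "> $outFile";

    system('/bin/bash', '-c', $cmd) == 0
        or die "archive/encrypt pipeline failed: exit " . ($? >> 8) . "\n";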

Re: [BackupPC-users] How does BackupPC work?

2010-11-02 Thread Carl Wilhelm Soderstrom
On 10/29 08:16 , Richard Shaw wrote: > There is surprisingly little info on how BackupPC really works, at > least with the google searches I've tried. I'm just looking for a > concise overview of how the different backup methods work and how they > are different from one another. Reading /etc/back

Re: [BackupPC-users] "File::RsyncP module doesn't exist" but the perl module is installed

2010-11-02 Thread Farmol SPA
Original Message Subject: Re: [BackupPC-users] "File::RsyncP module doesn't exist" but the perl module is installed From: Massimo Balestra To: backuppc-users@lists.sourceforge.net Date: Fri Oct 29 2010 17:05:38 GMT+0200 (Western Europe Daylight Time) > > Hi Alessandro, > (Ti
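When the module is installed yet BackupPC reports it missing, the cause is often a different perl interpreter or @INC path. A quick diagnostic along these lines, run as the backuppc user, can confirm what that perl actually sees (a sketch, not a BackupPC tool):

    #!/usr/bin/perl
    # Sketch: report which perl is running, its module search path, and
    # whether File::RsyncP is loadable from it. Illustrative diagnostic.
    use strict;
    use warnings;

    print "perl binary: $^X\n";
    print "\@INC:\n  ", join("\n  ", @INC), "\n";

    if (eval { require File::RsyncP; 1 }) {
        print "File::RsyncP loaded, version $File::RsyncP::VERSION\n";
    } else {
        print "File::RsyncP not visible from this interpreter:\n$@";
    }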

Re: [BackupPC-users] Cpool nightly clean removed 190 files from where??

2010-11-02 Thread gregwm
>> /e4/v/h/backuppc/bin/BackupPC_zcat LOG.2.z >> /e4/v/h/backuppc/bin/BackupPC_zcat LOG.1.z >> denied at /v/h/backuppc/bin/BackupPC_dump line 193 >> 2010-10-28 17:15:11  admin : Can't read /bc/backuppcdata/pc: No such >> file or directory at /v/h/backuppc/bin/BackupPC_sendEmail line 165. > > Why ar

[BackupPC-users] Using Wildcards in include file paths.

2010-11-02 Thread swisstone
The page on the wiki might help: http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Smb_exclude Just remember that with smb you can either include specific directories/files within a share, or exclude. You cannot do both. There is a nice re
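In config.pl terms, an include-only setup for an smb share might look like this sketch; the share name and paths are invented, and per the rule above it sets only the include list:

    # Illustrative include list for one smb share. Per the rule above,
    # do not also set BackupFilesExclude for the same share.
    $Conf{BackupFilesOnly} = {
        'C$' => [
            '/Users/alice/Documents',
            '/Users/alice/Pictures',
        ],
    };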

[BackupPC-users] brackup?

2010-11-02 Thread Les Mikesell
Has anyone run across 'brackup' (http://search.cpan.org/~bradfitz/Brackup-1.10/lib/Brackup.pm)? It is just a command line tool, not much like backuppc, but it appears to have some very interesting concepts for the backend storage, chunking and encrypting the files and then is able to store the