On 11/3/2010 2:26 PM, martin f krafft wrote:
>
>> I'd run new full backups as soon as practical. That will at least
>> fix up anything missing in the latest run which is usually the
>> most important.
>
> Yeah, that's surely a good idea. I was wondering mostly about
> cleanup actually.
>
> I assume
also sprach Carl Wilhelm Soderstrom [2010.11.03.2020 +0100]:
> Run full backups on all hosts, then BackupPC_nightly?
also sprach Les Mikesell [2010.11.03.2022 +0100]:
> I'd run new full backups as soon as practical. That will at least
> fix up anything missing in the latest run which is usually the most important.
On 11/3/2010 1:16 PM, martin f krafft wrote:
> Hello,
>
> My filesystem holding the backuppc pool was corrupted. While e2fsck
> managed to fix it all and now doesn't complain anymore, I am a bit
> scared that the backuppc pool isn't consistent anymore.
>
> Is there a tool to check the consistency of the pool?
On 11/03 07:16 , martin f krafft wrote:
> Is there a tool to check the consistency of the pool?
>
> Is there a tool to repair an inconsistent pool?
Run full backups on all hosts, then BackupPC_nightly?
--
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
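For anyone scripting Carl's suggestion: a minimal sketch using the documented server messages (the install path, host list, and backuppc user are assumptions here):

  # run as the backuppc user; queue a full (final argument 1) per host
  for h in host1 host2 host3; do
      /usr/local/BackupPC/bin/BackupPC_serverMesg backup $h $h backuppc 1
  done
  # then ask the running server for a nightly pool pass
  /usr/local/BackupPC/bin/BackupPC_serverMesg BackupPC_nightly run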
---
Hello,
My filesystem holding the backuppc pool was corrupted. While e2fsck
managed to fix it all and now doesn't complain anymore, I am a bit
scared that the backuppc pool isn't consistent anymore.
Is there a tool to check the consistency of the pool?
Is there a tool to repair an inconsistent pool?
I am trying to back up certain directories on a mounted hard drive and I am
having a lot of difficulty. It is on the localhost and I am running Ubuntu
10.10. My issue is that the paths contain spaces (e.g. /media/Folder/This
Folder). Whenever I try to enter this path into the web config (or ***
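The snippet is cut off before the actual failure, but one workaround when a web form mangles embedded spaces is to set the share directly in the per-host config file; a sketch, with the file name and transfer method as assumptions:

  # hypothetical /etc/backuppc/localhost.pl
  $Conf{XferMethod}     = 'rsync';
  # Perl list syntax carries the embedded space through unharmed
  $Conf{RsyncShareName} = ['/media/Folder/This Folder'];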
One more way to do it (more complex):
Modify the last full backup's timestamp in the "backups" file in the
BackupPC directory, e.g.:
/var/lib/backuppc/pc/$hostname/backups
This way you don't have to modify crontab or start the backup manually
from the web interface.
--
Michael
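A minimal sketch of that edit, assuming the "backups" file is tab-separated with field 2 = backup type and field 3 = start time (stop BackupPC and keep a copy of the file first):

  cd /var/lib/backuppc/pc/$hostname
  cp backups backups.bak
  # age the newest full by 8 days so the scheduler sees a full as overdue
  tac backups | awk -F'\t' 'BEGIN{OFS="\t"}
      !done && $2 == "full" { $3 -= 8*24*3600; done = 1 }
      { print }' | tac > backups.new
  mv backups.new backups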
On 11/3/2010 10:54 AM, Rob Poe wrote:
> Is there a way to control the day that the full backup happens? I have
> a server I'm backing up that has more than 450 gigs, and the full backup
> takes ~ 2 days, which I'd like to happen over a weekend.
>
> So I'd like the full to take place on a Friday.
I just have a cron job that starts a full backup at a specific time,
and then set the full backup period to slightly over 1 week.
Here's the line from crontab:
0 20 * * sat /usr/local/BackupPC/bin/BackupPC_serverMesg backup root 1
You have to put this in backuppc's crontab (or use su/sudo to change to
the backuppc user).
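The command above looks truncated by the archive; the message form documented for BackupPC takes four arguments (hostIP, host, user, doFull), so a Friday-evening full for a hypothetical host "fileserver" might look like:

  # in the backuppc user's crontab; final 1 = request a full backup
  0 20 * * fri /usr/local/BackupPC/bin/BackupPC_serverMesg backup fileserver fileserver backuppc 1

Pair this with a $Conf{FullPeriod} slightly over 7 days, as described above, so the regular scheduler never fires first.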
On 11/3/2010 11:54 AM, Rob Poe wrote:
> Is there a way to control the day that the full backup happens? I have
> a server I'm backing up that has more than 450 gigs, and the full backup
> takes ~ 2 days, which I'd like to happen over a weekend.
>
> So I'd like the full to take place on a Friday.
This may be way too complicated, but couldn't you create a loopback
filesystem that supports hardlinks in a file on Amazon? I know you can
do encrypted loopback fs. You could even do a journaling fs with the
journal stored on a local device to help with the performance.
--Tod
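A minimal sketch of the loopback idea, assuming the S3 bucket is already visible as a local path (the s3fs mount, sizes, and paths are all assumptions):

  # sparse ~500 GB image file on the remote-backed mount
  dd if=/dev/zero of=/mnt/s3/pool.img bs=1M count=0 seek=512000
  mkfs.ext4 -F /mnt/s3/pool.img        # -F: allow mkfs on a regular file
  mkdir -p /mnt/pool
  mount -o loop /mnt/s3/pool.img /mnt/pool
  # optional, per the external-journal idea:
  #   mke2fs -O journal_dev /dev/sdb1
  #   mkfs.ext4 -F -J device=/dev/sdb1 /mnt/s3/pool.img

Expect performance to hinge on how well the S3 layer copes with random writes into the image file.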
On Wed, Nov 3, 2010 a
>
> to copy my backuppc volume offsite i wrote a script to pick
> (from /pc/*/backups) the 2 most recent incremental and the
> 2 most recent full backups from each backup set and rsync all that to the
> remote site. i'm ignoring (c)pool but the hardlinks still apply amongst the
> selected backups.
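A rough sketch of the selection described in the quote above, assuming the per-host "backups" index is tab-separated (field 1 = backup number, field 2 = type); the single rsync run per host is what lets -H keep the hardlinks among the chosen trees:

  for hostdir in /var/lib/backuppc/pc/*/; do
      h=$(basename "$hostdir")
      nums=$( { awk -F'\t' '$2 == "full" { print $1 }' "$hostdir/backups" | tail -2
                awk -F'\t' '$2 == "incr" { print $1 }' "$hostdir/backups" | tail -2; } )
      srcs=""
      for n in $nums; do srcs="$srcs /var/lib/backuppc/pc/./$h/$n"; done
      # one rsync per host so -H can link identical files across the trees
      # ($srcs deliberately unquoted: the backup numbers contain no spaces)
      [ -n "$srcs" ] && rsync -aH --relative $srcs remote:/offsite/backuppc/
  done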
I used a udev rule (CentOS 5) to automatically mount (once) the USB
drive, and then a script that runs via cron to check and see if a USB
drive is mounted. Then it uses rsync to sync /var/lib/BackupPC to the
USB drive, runs a sync, then umounts the backup partition (thus
preventing the script from running again until a drive is attached).
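A minimal version of that cron script, with the mount point as an assumption:

  #!/bin/sh
  MNT=/mnt/usbbackup
  mountpoint -q "$MNT" || exit 0   # no drive mounted: nothing to do
  rsync -aH --delete /var/lib/BackupPC/ "$MNT/BackupPC/"
  sync                             # flush buffers before the unmount
  umount "$MNT"                    # safe to swap the drive offsite now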
Is there a way to control the day that the full backup happens? I have
a server I'm backing up that has more than 450 gigs, and the full backup
takes ~ 2 days, which I'd like to happen over a weekend.
So I'd like the full to take place on a Friday.
Unless BackupPC makes a "full" backup after t
On 11/3/2010 8:10 AM, Lee A. Connell wrote:
> I am using the single backup deletion script to clean out some backups
> manually, however when I run the script it tells me there are no backups
> for the backup number I chose. I can look under the host directory and I
> indeed do not see that number as a backup folder.
> ...saving to an Amazon s3 share...
> ..."So you have a nice
> non-redundant repo, and you want to make it redundant before you push it
> over the net??? Talk sense man!"
>
> The main question:
> ==
> He thinks it would be more bandwidth-efficient to tar up and encrypt the
> pool, whic
I am using the single backup deletion script to clean out some backups
manually, however when I run the script it tells me there are no backups
for the backup number I chose. I can look under the host directory and
I indeed do not see that number as a backup folder. Where is this
information being stored?
Craig,
I started deconstructing my script last night before leaving work to see if
the tar corruption was somehow my own fault. I got most of the way through
without encountering problems, so I'm beginning to think I botched the
redirection somewhere along the way. I'm out of the office today bu
Original Message
Subject: Re: [BackupPC-users] "File::RsyncP module doesn't exist" but
the perl module is installed
From: Tyler J. Wagner
To: General list for user discussion, questions and support
Date: Fri Oct 29 2010 18:08:56 GMT+0200 (Western European Summer Time)
> On Fri