Hi,
Saturday, August 15, 2009, 3:42:53 PM, you wrote:
>> > I'm experiencing some strange difficulties with BackupPC
>> > (3.1.0-3ubuntu1 on Ubuntu 8.04 LTS). It appears that BackupPC is not
>> > "recognizing" that it put files into the pool already. The log shows
>> > nightly a message according
I don't know what you mean by the SUA environment, but I use DeltaCopy on
Vista 64-bit via rsyncd.
http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
Works without a problem atm. Easy to use and you can copy the files to other
computers and easily register the service.
Greetings,
Koen Linders
---
Hi,
Andreas Pagander wrote on 2009-08-11 20:45:50 +0200 [[BackupPC-users] Offsite
backup using archive host]:
> I have some thought on automating an offsite backup using the Archive Host.
>
> -Use archive host to make a raw archive of the host
> -Rename the file(s)
> -Use rsync to sync the raw f
Hi,
Steve Blackwell wrote on 2009-08-13 00:26:20 -0400 [Re: [BackupPC-users]
100,000+ errors in last nights backup]:
> I started to reply to your e-mail but my system crashed. The messages
> log suggests that backuppc may have been the culprit. See below.
hmm. See below.
> On Thu, 13 Aug 2009 0
> that's because you've got an extraneous parameter (), which probably
> evaluates to 'true'.
Ahhh -- I read the instructions incorrectly (or too fast -- same thing).
Thanks!
> Yes. But I agree with Les that it doesn't hurt to fall back to automatic
> scheduling when your cron jobs fail for some
Hi,
Adam Goryachev wrote on 2009-08-13 15:42:26 +1000 [Re: [BackupPC-users]
100,000+ errors in last nights backup]:
> [...]
> I've frequently managed to cause two backuppc_dump's to run in parallel
> where one was scheduled by backuppc and one was run manually by me from
> the command line. It wo
Hi,
Les Mikesell wrote on 2009-08-17 16:38:23 -0500 [Re: [BackupPC-users] Is there
a speed setting?]:
> Jeremy Mann wrote:
> > I'm watching a live output of Ganglia showing network usage while the
> > backups are going. Also simple math.. I just finished one full backup, 16
> > GB in 143 minutes.
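The "simple math" in the quoted message works out as follows (a quick sketch; it assumes 16 GB means 16 * 10^9 bytes):

```python
# Rough throughput for the quoted full backup: 16 GB in 143 minutes.
gigabytes = 16
minutes = 143

bits = gigabytes * 10**9 * 8      # decimal GB -> bits
seconds = minutes * 60
mbps = bits / seconds / 10**6     # megabits per second

print(round(mbps, 1))             # roughly 15 Mbit/s, far under gigE line rate
```

So the observed rate is well below even the 50 Mbit/s mentioned elsewhere in the thread, which is why disk and compare overhead, not network bandwidth, is the usual suspect.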
Hi,
Richard Hansen wrote on 2009-08-14 14:36:06 -0400 [Re: [BackupPC-users] rsync
clients run out of memory]:
> Les Mikesell wrote:
> > [...]
> > If they are grouped in several subdirectories, you could break the
> > backups into separate runs. If they are all in one directory, even
> > protoc
Hi,
Bryan Gintz wrote on 2009-08-17 12:59:06 -0400 [[BackupPC-users] Full Size (GB)
showing 0.00 in Host Summary]:
> Any idea why it would show 0.00 for Full Size in Host Summary for my 2
> hosts (currently localhost, and 1 windows client), even though the
> current pool (in "Status" page) show
Jon Craig wrote:
> On Mon, Aug 17, 2009 at 5:38 PM, Les Mikesell
> wrote:
>> Jeremy Mann wrote:
>
>>> I'm watching a live output of Ganglia showing network usage while the
>>> backups are going. Also simple math.. I just finished one full backup,
>>> 16
>>> GB in 143 minutes. That's simply unaccep
Hi,
Clint Alexander wrote on 2009-08-17 09:10:20 -0400 [Re: [BackupPC-users]
Feature Requests and such?]:
> [...]
> I tested the idea of using Cron for incrementals:
> BackupPC/bin/BackupPC_serverMesg backup 0
>
> I used 0 for incremental, but for some strange reason -- it wanted to
> perf
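For reference, a cron-driven incremental along the lines the poster describes might look like this. This is a hypothetical sketch: the install path, host name, and the exact `BackupPC_serverMesg` argument order vary by version, so check the usage on your own install before copying it.

```shell
# Hypothetical crontab entry (path, host name and argument order are
# assumptions -- verify `BackupPC_serverMesg` usage for your version;
# the trailing 0 is intended to request an incremental, 1 a full):
# m  h dom mon dow user     command
30   1  *   *   *  backuppc /usr/share/backuppc/bin/BackupPC_serverMesg backup myhost myhost backuppc 0
```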
Hi,
Adam Goryachev wrote on 2009-08-18 10:42:42 +1000 [Re: [BackupPC-users] Feature
Requests and such?]:
> [...]
> > The point is that rsync needs to see the full BackupPC pool during one run.
> > You can't split it up into separate syncs of pool/, cpool/, pc/host1/,
> > pc/host2/ and so on.
>
>
On Mon, Aug 17, 2009 at 5:38 PM, Les Mikesell wrote:
> Jeremy Mann wrote:
>> I'm watching a live output of Ganglia showing network usage while the
>> backups are going. Also simple math.. I just finished one full backup, 16
>> GB in 143 minutes. That's simply unacceptable for a full backup.
>
Tha
Hi,
Adam Goryachev wrote on 2009-08-17 13:15:59 +1000 [Re: [BackupPC-users] Feature
Requests and such?]:
> Clint Alexander wrote:
> > Hi BackupPC List Members (and owners)..
> >
> > One thing that has frustrated the heck out of me is the lack of finding
> > 1) I want to see *when* a backup is g
Holger Parplies wrote:
> Hi list,
>
> I don't seem to find as much time for posting on this list lately as I'd like
> to. Since that won't change for at least two weeks, I'll write a few short
> comments now. Please excuse me for being a bit terse abou
Hi list,
I don't seem to find as much time for posting on this list lately as I'd like
to. Since that won't change for at least two weeks, I'll write a few short
comments now. Please excuse me for being a bit terse about it and only
providing some keywords. I hope others will follow up with more i
Hi,
anyone successfully using the SUA environment for backing up a windows
vista 64bit client via ssh-rsync or rsyncd?
I failed running cygwin on Vista Business 6.0 64-bit and considered
giving MS a chance ...
Any comments very much appreciated,
thanks in advance,
Bernhard
Jeremy Mann wrote:
>
>> What operations are you watching to see these numbers? The only one
>> where network bandwidth matters much is the initial copy of a new host.
>> The rest of the time you are mostly doing comparisons. Backuppc
>> will be slower than native rsync because it is in perl
Les Mikesell wrote:
> What operations are you watching to see these numbers? The only one
> where network bandwidth matters much is the initial copy of a new host.
> The rest of the time you are mostly doing comparisons. Backuppc
> will be slower than native rsync because it is in perl and
Jeremy Mann wrote:
> Filipe Brandenburger wrote:
>
>> 50Mbps is actually quite a lot, and it's probably close to the
>> bottleneck of your disks. You should use "iostat" on client and server
>> while backups are running to see if you're getting 100% util of the
>> disks that are being backed up.
>>
Michael Stowe wrote:
>
> So, wait, something in your network is capable of sustaining this kind of
> bandwidth usage in anything other than short bursts? Were you sending
> sparse files or something? Were you backing up /dev/random?
>
> I'd really like to hear about the setup that managed to use
So, wait, something in your network is capable of sustaining this kind of
bandwidth usage in anything other than short bursts? Were you sending
sparse files or something? Were you backing up /dev/random?
I'd really like to hear about the setup that managed to usefully transfer
data anywhere nea
On 16 Aug 2009 at 14:11, Michael 'Moose' Dinn wrote:
> > another unit offsite. The extensive use of hardlinks prevents
> > rsync from being the right solution here.
>
> rsync -H doesn't work for you?
From earlier discussions on the list:
Re: [BackupPC-users] backup of backup machine
"
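The pool problem behind this exchange is that BackupPC stores each unique file once and hardlinks it from every backup tree, so `rsync -H` must track every inode it has seen to recreate those links. A minimal Python sketch (the file names are made up) of how many directory entries can point at one inode:

```python
import os
import tempfile

# BackupPC keeps one copy of each unique file in the pool and hardlinks
# it from every backup; rsync -H has to remember all of these in memory.
with tempfile.TemporaryDirectory() as pool:
    src = os.path.join(pool, "cpool_file")
    with open(src, "w") as f:
        f.write("file contents stored once in the pool\n")

    # Simulate many backups referencing the same pooled file.
    for i in range(1000):
        os.link(src, os.path.join(pool, f"backup_{i}"))

    st = os.stat(src)
    print(st.st_nlink)   # 1001 directory entries, one inode on disk
```

With millions of pooled files, that per-inode bookkeeping is what makes `rsync -H` on a whole pool so memory-hungry.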
Filipe Brandenburger wrote:
> 50Mbps is actually quite a lot, and it's probably close to the
> bottleneck of your disks. You should use "iostat" on client and server
> while backups are running to see if you're getting 100% util of the
> disks that are being backed up.
>
> In BackupPC's case, as i
Hi,
On Mon, Aug 17, 2009 at 14:55, Jeremy Mann wrote:
> Just curious if there is a bandwidth speed setting for BackupPC because
> I'm not seeing a lot of bandwidth when the backups occur. All my servers
> are on gigE and I'm not even seeing 50Mbit speeds between the servers and
> backupPC server.
Jeremy Mann wrote:
> Just curious if there is a bandwidth speed setting for BackupPC because
> I'm not seeing a lot of bandwidth when the backups occur. All my servers
> are on gigE and I'm not even seeing 50Mbit speeds between the servers and
> backupPC server.
There isn't one built in, but 50Mb
Just curious if there is a bandwidth speed setting for BackupPC because
I'm not seeing a lot of bandwidth when the backups occur. All my servers
are on gigE and I'm not even seeing 50Mbit speeds between the servers and
backupPC server.
--
Jeremy Mann
jer...@biochem.uthscsa.edu
University of Tex
kirrus wrote:
>
> Trying on the command line, with the backuppc user & pass, gives this:
> rsync --port 873 roger-d...@roger-desk::kDrive
> Password:
> @ERROR: chdir failed
> rsync error: error starting client-server protocol (code 5) at
> main.c(1383) [receiver=2.6.9]
>
The reason is always tha
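For context, `@ERROR: chdir failed` from an rsync daemon usually means the module's `path` does not exist or is not accessible to the user the daemon runs as. A hypothetical rsyncd.conf module for comparison (the module name, path, and user here are examples, not taken from the poster's setup):

```ini
# rsyncd.conf on the client (names and paths are illustrative)
[kDrive]
    path = /cygdrive/k            ; must exist and be readable by the rsync user
    read only = yes
    auth users = backuppc
    secrets file = /etc/rsyncd.secrets
```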
Bryan Gintz wrote:
> I am not sure if this is asked or answered somewhere else. If it is,
> could someone point me there?
>
> I am wondering if there is a way to setup the ability to backup
> "RsyncShareName" #1 on a different schedule than "RsyncShareName" #2.
> For example, I want to backup
Any idea why it would show 0.00 for Full Size in Host Summary for my 2
hosts (currently localhost, and 1 windows client), even though the
current pool (in "Status" page) shows 13.89G ?
Thanks,
Bryan
--
Bryan Gintz
IT Coordinator
Central Ohio Youth for Christ
Email: bgi...@coyfc.org
Phone: [W
I am not sure if this is asked or answered somewhere else. If it is,
could someone point me there?
I am wondering if there is a way to setup the ability to backup
"RsyncShareName" #1 on a different schedule than "RsyncShareName" #2.
For example, I want to backup my entire "My Documents" folde
Clint Alexander wrote:
>
> But the real question is: If I want to run these 4 backups from Cron, I
> should turn the BackupsDisabled to 1, right?
> This should prevent another backup from running needlessly, but does this
> interfere with a command being run from Cron?
You could just bump the Fu
> I may be out of line here, but I think you might have a config similar
> to this:
I'm not the best when it comes to being completely educated in the way
backups systems work. My area of expertise can be better defined as a "Jack
of All" and master of few. But, I took some time to study the dif
On 08/14 11:46 , Kanwar Ranbir Sandhu wrote:
> Problem solved! I found a gnome mplayer cache dir in my home dir which
> had MILLIONS of files in them. They were chewing up GBs of space, and
> over 31% of the inodes in /home. I have no idea how those files were
> created.
On a slightly related n
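A quick way to spot directories like that cache (huge numbers of tiny files eating inodes) is to count entries per directory and sort. A minimal Python sketch, using a throwaway tree as a stand-in for a home directory:

```python
import os
import tempfile
from collections import Counter

def count_files_per_dir(root):
    """Count regular-file entries under each directory below root."""
    counts = Counter()
    for dirpath, dirnames, filenames in os.walk(root):
        counts[dirpath] = len(filenames)
    return counts

# Example: build a small tree and find the worst offender.
with tempfile.TemporaryDirectory() as home:
    cache = os.path.join(home, ".cache", "player")
    os.makedirs(cache)
    for i in range(500):          # stand-in for "millions" of cache files
        open(os.path.join(cache, f"f{i}"), "w").close()

    worst, n = count_files_per_dir(home).most_common(1)[0]
    print(worst, n)               # the cache dir tops the list
```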
David wrote:
>
> Where the real problem comes in, is if admins want to use 'updatedb',
> or 'du' on the linux system. updatedb gets a *huge* database and uses
> up tonnes of cpu & ram (so, I usually disable it). And 'du' can take
> days to run, and make multi-gb files.
You can exclude directorie
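For updatedb specifically, the exclusion normally goes in /etc/updatedb.conf; a hypothetical fragment (the pool path is an assumption, adjust it to your actual TopDir):

```ini
# /etc/updatedb.conf -- keep locate's database out of the BackupPC pool
PRUNEPATHS="/tmp /var/spool /var/lib/backuppc"
```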
Hi there.
Firstly, this isn't a backuppc-specific question, but it is of
relevance to backup-pc users (due to backuppc architecture), so there
might be people here with insight on the subject (or maybe someone can
point me to a more relevant project or mailing list).
My problem is as follows... w