On Thu, Mar 09, 2006 at 06:31:51PM -0800, Matt wrote:
> Wouldn't it be better to keep the directory structure of the
> (compressed) files and keep hash and attributes in the DB? After all,
> this is how the data are received and how they will be accessed during
> a restore.
Even that isn't all
On Thu, 2006-03-09 at 18:31 -0800, Matt wrote:
> Speedwise, backuppc really sucks compared to dirvish. With dirvish I was
> able to back up 1.4 TB in 30 minutes. Most of the time was spent by
> rsync collecting and transmitting the file lists. Now a full backup
> takes almost a day, even if the a
I am having nothing but trouble doing backups. I have a backup I'm trying to run; it's at least 1,000,000 files. It takes over an hour to build the list, and it seems to start copying files but then just randomly exits, as shown below. And it does it at different stages all the time, so its no
David Brown wrote:
> I think the solution is to change backuppc to not create multiple trees,
> but to store the filesystem tree in some kind of database, and just store
> the files themselves in the pool. Database engines are optimized to be
> able to handle multiple indexing into the data, wher
So are you saying that using Tar over
ssh would be a better option in this case? (I'm not sure what netcat is.)
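For reference, switching a host to tar over ssh is mostly a per-PC config change. A rough sketch, modelled on the stock config.pl (the exact command string below is illustrative, so check it against your installed defaults):

# Sketch of a tar-over-ssh transfer for one host; adjust shares and paths.
$Conf{XferMethod}   = 'tar';
$Conf{TarShareName} = ['/'];
# $sshPath, $host, $tarPath and $shareName+ are substituted by BackupPC.
$Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
                    . ' env LC_ALL=C $tarPath -c -v -f - -C $shareName+'
                    . ' --totals';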
Carl Wilhelm Soderstrom <[EMAIL PROTECTED]>
Sent by: [EMAIL PROTECTED]
10/03/2006 08:27 AM
To: backuppc-users@lists.sourceforge.net
cc:
Subject: Re: [BackupPC-users] CPU usage
On 3/9/06, Carl Wilhelm Soderstrom <[EMAIL PROTECTED]> wrote:
> > On Thu, 2006-03-09 at 14:40, [EMAIL PROTECTED] wrote:
> > > it just hovers at about 300kb/s. I would expect that when the file
> > > listing is sent, for there to be a heavy load on the network at that
> > > time, and then some he
Perhaps a better question would be how to prevent the offending file
from changing while it's being read by rsync. After all, a backup of a
large file that has changed mid-copy isn't really a backup, is it... :-)
If the source is a Linux machine there are tools such as LVM that will
allow you to
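One common way to wire that in is through the pre/post dump hooks, so a read-only snapshot exists for the duration of the backup. A rough sketch, assuming the client provides two small helper scripts (make-backup-snapshot and remove-backup-snapshot are hypothetical names; they would wrap lvcreate -s, mount, umount and lvremove):

# Sketch: snapshot the source volume before the dump, tear it down afterwards.
# The helper scripts are hypothetical; on the client they would run something
# like "lvcreate -L2G -s -n backup-snap /dev/vg0/data" plus a read-only mount,
# and the reverse when the dump finishes.
$Conf{DumpPreUserCmd}  = '$sshPath -q -x -l root $host /usr/local/sbin/make-backup-snapshot';
$Conf{DumpPostUserCmd} = '$sshPath -q -x -l root $host /usr/local/sbin/remove-backup-snapshot';
# Then back up the snapshot mount point instead of the live filesystem, e.g.:
$Conf{RsyncShareName}  = ['/mnt/backup-snap'];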
Sorry, obviously that was for my IT team here at work.
You are a backuppc developer, aren't you, Carl? You certainly know a lot.
Cheers
L.
> Hi,
>
> I saw this on the backuppc list, and thought it might interest some of you.
>
> Carl is one of the main Backuppc developers.
>
> Cheers
>
On 03/09 01:43 , David Brown wrote:
> Does --whole-file help, or is it slow even at that?
dunno, never tried that option. :)
--
Carl Soderstrom
Systems Administrator
Real-Time Enterprises
www.real-time.com
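For anyone who does want to try it, --whole-file turns off rsync's delta-transfer algorithm, which can help when disk I/O rather than the network is the bottleneck. A sketch of adding it to the argument list, assuming your rsync transport accepts the option; the list below is abbreviated, not the full shipped default:

# Sketch: append --whole-file to the rsync arguments for this host.
$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group',
    '--devices', '--links', '--times', '--recursive',
    '--whole-file',    # send whole files instead of computing deltas
];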
On Thu, Mar 09, 2006 at 03:27:32PM -0600, Carl Wilhelm Soderstrom wrote:
> > On Thu, 2006-03-09 at 14:40, [EMAIL PROTECTED] wrote:
> > > it just hovers at about 300kb/s. I would expect that when the file
> > > listing is sent, for there to be a heavy load on the network at that
> > > time, and t
IMHO, BackupPC could use a combination of index data in the database
and files on the FS. The database could be anything - sqlite3, mysql, even
something ODBC-compliant. That would speed up some checks, I believe.
2006/3/8, David Brown <[EMAIL PROTECTED]>:
On Tue, Mar 07, 2006 at 09:23:36AM -0600, Carl Wilhelm Sode
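Purely to illustrate that idea (this is not anything BackupPC does today), an index of per-file attributes keyed to pool digests might start out something like the sketch below; the table and column names are invented:

# Hypothetical sketch of an attribute/digest index in SQLite via DBI.
use DBI;

my $dbh = DBI->connect('dbi:SQLite:dbname=backuppc-index.db', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do(<<'SQL');
CREATE TABLE IF NOT EXISTS file_index (
    host    TEXT    NOT NULL,  -- client host name
    backup  INTEGER NOT NULL,  -- backup number
    path    TEXT    NOT NULL,  -- path as seen on the client
    digest  TEXT    NOT NULL,  -- pool file digest (content hash)
    mode    INTEGER,           -- permissions
    uid     INTEGER,
    gid     INTEGER,
    size    INTEGER,
    mtime   INTEGER,
    PRIMARY KEY (host, backup, path)
)
SQL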
> On Thu, 2006-03-09 at 14:40, [EMAIL PROTECTED] wrote:
> > it just hovers at about 300kb/s. I would expect that when the file
> > listing is sent, for there to be a heavy load on the network at that
> > time, and then some heavy cpu work, and then more load on the network
> > as it hits files t
On Thu, 2006-03-09 at 14:40, [EMAIL PROTECTED] wrote:
> Les,
> I am not too distressed by the processor usage, as this
> machine is dedicated to backuppc, but I am trying to work out why the
> network utilisation is so low. I would expect that if two machines had
> a gigabit connection then
Les,
I am not too distressed by the processor usage, as this machine is dedicated
to backuppc, but I am trying to work out why the network utilisation is
so low. I would expect that if two machines had a gigabit connection then
I should be able to get throughput up to around 100 MB/s (ish).
When I say 100%, I mean looking at top or System Monitor in Gnome; both
show the processors running at 100% usage, not the number of tasks running
on the processor.
Guus Houtzager <[EMAIL PROTECTED]>
Sent by: [EMAIL PROTECTED]
10/03/2006 03:14 AM
To: backuppc-users@lists.sourceforge.net
Could someone please point me at where the format of XferLOG is described? It's
actually the third column that I'm not sure of, but if there's documentation
I've missed I'd rather read that than keep asking little questions.
Just use the $Conf option. Assuming /var is on a separate partition:
$Conf{BackupFilesExclude} = {
    '/var' => [
        '/tmp',
    ],
};
If it is on the / partition:
$Conf{BackupFilesExclude} = {
    '/' => [
        '/var/tmp',
    ],
};
If you really want to
Does anyone have any working examples of using --exclude with rsync? I've tried the following syntax already: '--exclude', '/var/tmp/', '--exclude', '/var/tmp'. Here's the appropriate section from the per-pc
config.pl with the latest attempt at the bottom:
$Conf{RsyncArgs} = [ # #
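For what it's worth, the two ways people usually spell this in a per-pc config.pl are sketched below; the paths are just examples, and the push variant assumes extra arguments get passed through to rsync unchanged:

# Variant 1: share-relative excludes, which BackupPC turns into --exclude args.
$Conf{BackupFilesExclude} = {
    '/' => [ '/var/tmp', '/tmp' ],
};

# Variant 2 (assumption: extra args are passed straight through to rsync):
# append explicit --exclude options to the existing argument list.
push @{$Conf{RsyncArgs}}, '--exclude=/var/tmp/', '--exclude=/tmp/';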
Hi
I see, I can restore backups directly, as zip or as tar. I have a problem
with this:
The zip version will probably not contain any access rights, owner
information and so on. AFAIK, zip does not support these.
The tar version will be large, as most of my data is highly compressible. I
would nee
I've been having problems with some of my backups, resulting in apparent
success but empty backup directories. (We back up the C$ share using
samba/smbclient with BackupPC. The failing client runs Windows XP SP2.)
From BackupPC's perspective, the symptoms are
NT_STATUS_INSUFF_SERVER_RESOURCES e
How do I log in to backuppc as different users? I can't seem to find a
way to "log out" as the current user. Does BackupPC have a logout
facility?
Does the login use cookies or some other way to remember my login?
How do I clear this?
Thanks,
Brendan.
-
On Tue, Mar 07, 2006 at 09:23:36AM -0600, Carl Wilhelm Soderstrom wrote:
> I'm experimenting with an external firewire drive enclosure, and I formatted
> it with 3 different filesystems, then used bonnie++ to generate 10GB of
> sequential data, and 1,024,000 small files between 1000 and 100 bytes
On Thu, 2006-03-09 at 01:23, [EMAIL PROTECTED] wrote:
> I have a dual core AMD Athlon 64 processor and when backups are taking
> place the processor usage generally sits at 100% on both cores (When
> multiple machines are being backed up), if not at 100% then generally
> very close to it
>
>
On Thu, 2006-03-09 at 18:23 +1100, [EMAIL PROTECTED] wrote:
>
> I was hoping to find out if anyone else has seen this situation:
>
> I have a dual core AMD Athlon 64 processor and when backups are taking
> place the processor usage generally sits at 100% on both cores (When
> multiple machines a
Hi! I have a little trouble: I made a full backup of one computer and it
works well with a full and two incremental backups, but when BackupPC
tries to do the next backup it crashes, and now I cannot do either a full
or an incremental because BackupPC gives me the error that I attach at the
end.
The first thing
Gwenn,
I have tried that; with compression set to 0, processor usage still sits
at 100% on both cores, and the bandwidth usage is sitting at 2MB/s.
Jamie
Gwenn Boussard <[EMAIL PROTECTED]>
09/03/06 07:39 PM
To: [EMAIL PROTECTED]
cc: backuppc-users
Subject: Re: [BackupPC-use
Hello,
On my site, for authentication, I have created a login page in PHP
with the possibility for each user to choose an appropriate mechanism
for authentication (like LDAP, NIS, ...).
When the authentication succeeds, I create cookies:
if($checklogin===TRUE) {
I have a dual core AMD Athlon 64 processor, and when backups are taking
place the processor usage generally sits at 100% on both cores (when
multiple machines are being backed up); if not at 100%, then generally
very close to it.
At the same time, with 3 hosts being backed up, two of which have gi