David Rees wrote:
> On Dec 18, 2007 5:05 PM, Brendan Simon <[EMAIL PROTECTED]> wrote:
>
>> So is the bottleneck rsync or the number of files or memory ???
>>
>
> In this case, it's neither the number of files nor memory.
>
> If you look at top in thi
David Relson wrote:
> On Mon, 17 Dec 2007 16:44:53 +1100
> Brendan Simon wrote
>> * The backuppc server is an Intel P3 800MHz with 512MB RAM and
>> 550GB of raid storage for data backups.
>> * The linux host I am backing up is a Dual Processor AM
I want my users to have a bit more control and feedback for their backups.
At the moment I have to add htpasswd accounts for them on the backuppc
server. Also I think all email is going to [EMAIL PROTECTED] (for which
there is no real account).
How do I get email to go to user and/or admin a
Does anyone have any experiences or views on upgrading from 2.1.2 to
either 3.0.0 (3.0.0-4 on Debian) or 3.1.0 ??
Does 3.0.0 have known bugs that make it worth upgrading to 3.1.0 ??
Are there major changes in 3.1.0 over 3.0.0 that may make it dangerous
to upgrade to 3.1.0 rather than to 3.0.0 ??
two clients.
>
> considering the 22.6GB I would guess that you have quite a lot of
> files. rsync has an issue with very high file counts.
>
> On Dec 16, 2007 6:01 PM, Brendan Simon <[EMAIL PROTECTED]> wrote:
>
> I need to really s
John Pettitt wrote:
> Brendan Simon wrote:
>> I thought rsync is supposed to be more efficient than a raw copy, but
>> obviously not. I'm presuming it is rsync that is the issue here.
>> Would rsyncd improve the speed?
>> Maybe it would be faster with nfs/tar but
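The tar-over-ssh idea mentioned above can be tried in miniature before changing any BackupPC config. This is only a sketch: it runs both tars locally under /tmp/tartest (a made-up scratch path); in the real case the left-hand tar would run on the client via ssh (e.g. `ssh client 'tar -C /home -cf - .'`).

```shell
# Local dry run of a tar pipe; /tmp/tartest is a hypothetical scratch dir.
set -e
rm -rf /tmp/tartest
mkdir -p /tmp/tartest/src /tmp/tartest/dst
echo "hello" > /tmp/tartest/src/a.txt
# Over the network, the left-hand tar would be wrapped in ssh.
tar -C /tmp/tartest/src -cf - . | tar -C /tmp/tartest/dst -xf -
cat /tmp/tartest/dst/a.txt   # prints "hello" if the pipe worked
```

Note the trade-off: a tar pipe avoids rsync's per-file delta computation and file-list overhead, but always sends full file contents.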
I need to really speed up my backup of Linux boxes/directories !!!
I'm using ssh/rsync to do Linux backups. As an example (see end for
more details):
* a backup of a 22.6GB Linux directory is taking 2037 minutes (34
hours = 1.4 days).
* a backup of a 131.4GB WinXP directory is takin
I have backuppc running on a Debian Etch system.
For some reason my backups aren't occurring anymore. I can initiate
them manually via the web interface but they don't seem to start
automatically each night.
What reasons could prevent them starting automatically each night ???
Thanks, Brendan.
Cool.
So are you saying that 3.0.0 has no known critical bugs, and that 3.1.0
just has minor bug fixes/improvements and some new features? If so,
then I should be confident in upgrading from 2.1.1 to 3.0.0 and not have
any problems, right?
Has anyone had problems upgrading from 2.1.1 to 3.0.
I want to upgrade to BackupPC 3.x.y, but am not game to upgrade to
3.0.0. I'd like to at least wait until the first bug fix release (3.0.1).
Does anyone know when that is due for release?
Thanks, Brendan.
I'm getting a fatal error when backing up an empty directory.
BackupPC server is running Debian Sarge (backuppc 2.1.1-2sarge2)
Server being backed up is running Debian Etch (rsync 2.6.9-2)
Surely it must be legitimate to back up an empty directory. Is this a
bogus error message or maybe rsync
I'm getting xfer errors on long path/filenames. The total length is over 100 chars.
Is there a limit on the total path/filename size ???
Example error:
Read error: Connection reset by peer opening remote file
\bjsimon\ABCD\svn-sandbox\branches\firmware\P20-NiceBigProject-2\F1234.humongous-mutation\proj
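Before assuming a specific limit, it can help to see how many paths on the share are actually that long (smbclient and Windows shares can fail on long paths well before the underlying filesystem would). A sketch: the 100-character threshold matches the length mentioned above, not any documented limit, and /tmp/pathscan is a placeholder for the share root.

```shell
# List paths longer than 100 characters under a tree.
# /tmp/pathscan is a toy stand-in; point "root" at the real share root.
root=/tmp/pathscan
rm -rf "$root"
mkdir -p "$root/svn-sandbox/branches/firmware/Project-With-A-Nice-Big-Name/another-very-long-subdirectory-name"
find "$root" -print | awk 'length($0) > 100'
```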
How do I login to backuppc as different users? I can't seem to find a
way to "logout" as the current user. Does BackupPC have a logout
facility ???
Does the login use cookies or someother way to remember my login ??
How do I clear this???
Thanks,
Brendan.
Les Mikesell wrote:
If you are using basic http authentication, the browser
remembers the credentials until you close it.
OK. I think I am.
Are there other authentication methods besides using the htpasswd file?
I'm trying to create a setup where all our unix and windows boxes use a
centralise
I am using $Conf{ClientNameAlias} quite happily with rsync/ssh unix
hosts, but I can't get it to work with a Windows/smb server.
I assume it should work, or is this a bad assumption ???
Thanks,
Brendan.
I notice that if I do a full backup the summary page for the host
reports filled=yes, whereas if I do an incremental backup the page
reports filled=no.
(Summary table columns: Backup# | Type | Filled | Start Date | Duration/mins | Age/days | Server Backup Path)
Hi,
I'm running BackupPC on Debian/Sarge. The status page always says the
next backup will happen "today" sometime, but it never ever seems to
happen. If I backup hosts manually then things seem to work. Any idea
why the scheduling would not work ??? Below is a cut/paste of my
status page
Les Mikesell wrote:
That should already be happening. Rsync does transfer the entire
directory tree before starting and there is a certain amount of
memory overhead per file. If you are short of RAM on the backuppc
server, this could make the process very slow. You might have
a big improvement i
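Les's point about per-file memory can be sanity-checked with a simple file count. The ~100 bytes/entry figure below is a rough rule of thumb for rsync 2.x file lists, not a documented constant, and /tmp/countdemo is just a toy stand-in; for a real share, something like `find /home -xdev | wc -l` gives the count that matters.

```shell
# Toy estimate of rsync 2.x file-list memory; ~100 bytes/entry is an
# approximation. /tmp/countdemo stands in for the tree being backed up.
rm -rf /tmp/countdemo
mkdir -p /tmp/countdemo
touch /tmp/countdemo/f1 /tmp/countdemo/f2 /tmp/countdemo/f3
files=$(find /tmp/countdemo -xdev | wc -l)
echo "$files entries"                         # the directory itself + 3 files
echo "approx $((files * 100)) bytes of file-list memory"
```

On a tree with millions of entries this estimate quickly exceeds the 512MB mentioned earlier in the thread, which would push the server into swap.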
nd things
worked great). I use xtar over ssh (Mac OSX) and easily backup 40GB
(what is your backup window?).
Brendan Simon wrote:
Hi,
I'm using backuppc on a local network. I'm currently using ssh and
rsync but I don't think it is fast enough, at least for the full
backu
Hi,
I'm using backuppc on a local network. I'm currently using ssh and
rsync but I don't think it is fast enough, at least for the full
backups. I'm not quite sure where the bottleneck is (server, network,
ssh, rsync, tar, gz, etc).
For now I'll assume it's my ssh/rsync setup. I want to k
Hi,
Could someone tell me what the following errors mean?
Unexpected call
BackupPC::Xfer::RsyncFileIO->unlink(john/aegis/CN.1.5.1.4.C117/images/CN-image.tar.gz)
[ skipped 21 lines ]
Unexpected call
BackupPC::Xfer::RsyncFileIO->unlink(john/aegis/CN.1.5.1.4.C117/src/ethernetd/main.c)
[ skipped 4
I have the following errors when trying to back up a 40GB partition on a
remote unix server.
Are the missing files an issue?
Maybe the files were there when the filelist was built but were deleted
when the actual transfer was attempted?
My main concern is at the end where it says "Got fatal er
, not enough space, etc.
Cheers,
Brendan.
Brendan Simon wrote:
I fixed the remote login problem, but now my backups seem to be
exiting prematurely.
The only error message is:
backup failed (Child exited prematurely)
Anyone have any idea what could be causing this?
I'm using rsync/rsh fro
test directory on a PowerPC Debian Linux development server.
Thanks,
Brendan.
If you need it in other cases it's better to surround it with a test for
tty-ness:
[ -t 0 ] && {
mesg n
}
Maarten
On Friday 30 September 2005 10:08, Brendan Simon wrote:
Craig Barratt wrote:
Brendan Simon writes:
OK. I got my local host backing up OK :)
Now when I try my remote server (Debian Linux PPC) with rsync/ssh I get
an error saying "fileListReceive failed". I have constrained the backup
directory to my own temp directory "/ho
Please edit your Subject line so it is more specific
than "Re: Contents of BackupPC-users digest..."
Today's Topics:
1. Re: Errors backing up Linux server (Brendan Simon)
2. Re: Errors backing up Linux server (Torsten Sadowski)
Message: 1
Date: Thu, 29 Sep 20
I'm backing up /dev so special files are no
problem. You should try to make a localhost backup first. I had some
"File list receive failed" problems with windows and there it was the
firewall. Do the other Linux servers have one? If they have, open TCP
port 873 for rsync to work. HTH,
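Whether port 873 matters depends on the transfer method: rsync over ssh only needs the ssh port, while an rsyncd transfer connects to 873 directly. A quick probe, sketched with bash's /dev/tcp redirection (`nc -z client 873` is an alternative); 127.0.0.1 below is just a locally runnable placeholder for the client's address.

```shell
# Probe TCP 873 (rsyncd). "host" is a placeholder; set it to the client.
host=127.0.0.1
if timeout 2 bash -c "exec 3<>/dev/tcp/$host/873" 2>/dev/null; then
    echo "port 873 open on $host"
else
    echo "port 873 closed or filtered on $host"
fi
```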
Hi,
I'm a backuppc newbie and am trying to use backuppc on a Debian Linux
host (AMD Athlon 1.4GHz, 256MB) to back up some Linux servers. At the
moment I'm just trying to back up the home directory on one Debian Linux
PPC server. The home directory is about 40GB. Eventually I'd like to
backup