Hi Torsten.

Good idea to get localhost going first :)
Quick question though.
What is "GetTar"? Why is it in the /etc/backuppc/ directory?
Is it a binary or a script that comes with backuppc? If so, I don't have it!
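My guess (purely an assumption on my part) is that GetTar is just a local copy of, or a thin wrapper around, tar that you put in /etc/backuppc so the sudo rule can be restricted to that one command rather than to tar in general. Something like:

    # hypothetical reconstruction -- GetTar is not a file that ships with backuppc
    cp /bin/tar /etc/backuppc/GetTar
    chmod 755 /etc/backuppc/GetTar
    # with a matching /etc/sudoers entry limiting the backuppc user to it:
    # backuppc ALL=NOPASSWD: /etc/backuppc/GetTar

Is that roughly what you did?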

Thanks,
Brendan.

[EMAIL PROTECTED] wrote:

Message: 4
Date: Wed, 28 Sep 2005 11:26:00 +0200
Subject: Re: [BackupPC-users] Errors backing up Linux server
From: Torsten Sadowski <[EMAIL PROTECTED]>
To: backuppc-users@lists.sourceforge.net
Organization: TU Berlin

Hi Brendan,

I'm using tar over ssh for the backup of Linux machines and just tar for localhost (the backuppc server). My localhost.pl looks like this:

    $Conf{XferMethod} = 'tar';
    $Conf{BackupFilesExclude} = ['/var/lib/backuppc'];
    $Conf{TarIncrArgs} = '--newer=$incrDate $fileList';
    $Conf{TarShareName} = ['/bin','/boot','/dev','/etc','/home','/initrd','/lib','/opt','/root','/sbin','/usr','/var'];
    $Conf{TarClientCmd} = '/usr/bin/sudo /etc/backuppc/GetTar -v -f - -C ' . '$shareName --totals';

As you see I'm backing up /dev so special files are no problem. You should try to make a localhost backup first. I had some "File list receive failed" problems with windows and there it was the firewall. Do the other Linux servers have one? If they have, open TCP port 873 for rsync to work.

HTH,
Torsten

On 2005-09-28 02:54:14 +0200 Brendan Simon <[EMAIL PROTECTED]> wrote:

Hi,

I'm a backuppc newbie and am trying to use backuppc on a Debian Linux host (AMD Athlon 1.4GHz, 256MB) to back up some Linux servers. At the moment I'm just trying to back up the home directory on one Debian Linux PPC server. The home directory is about 40GB. Eventually I'd like to back up the entire PC (ie. all partitions).

I've tried tar/nfs & rsync/ssh but the backup fails.
When using rsync, it gets a broken pipe and/or a timeout after a couple of days of trying to receive the file list. The home directory contains special files (eg. /home/user/project.1.2.3/dev/null) because we develop a Linux filesystem. I was wondering if rsync was barfing on these types of files, since trying to read a device file could cause it to read forever.
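If the special files do turn out to be the problem, I guess I could exclude those embedded dev trees and raise the timeout while testing. Something like this in the per-host config, where the exclude path is just an example from our layout and the timeout value is a guess:

    # exclude the chroot-style dev trees under the /home share (example path,
    # relative to the share -- I'm not sure if it needs to be absolute instead)
    $Conf{BackupFilesExclude} = ['/user/project.1.2.3/dev'];
    # give the file-list phase more time on a 40GB share (seconds; value is a guess)
    $Conf{ClientTimeout} = 72000;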

I assume that backuppc (tar or rsync, etc) is smart enough not to _read_ the contents of special files. Is that correct? My assumption is based on backuppc being able to back up an entire machine, including the root filesystem, /dev directory, etc.
Is there something special I have to set up in my config?
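For what it's worth, my understanding (please correct me if I'm wrong) is that both tar and rsync archive device nodes as metadata rather than reading their contents, but rsync only preserves them when given --devices. So I'd expect the rsync arguments in the config to include it, something like:

    # my guess at a sensible set -- only --devices matters for the special
    # files; rsync then recreates the nodes and never reads their contents
    $Conf{RsyncArgs} = [
        '--numeric-ids', '--perms', '--owner', '--group',
        '--devices', '--links', '--times', '--recursive',
    ];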

I presume that backing up 40+GB with backuppc is also doable?
Are there any limitations on file size, performance, etc?

Would rsyncd be a better approach?
I can only assume ssh would slow things down, and since I'm on a LAN behind a firewall I don't really need the security across the LAN. Then again, I guess people's data could be sniffed off the network, but that's unlikely if we're using switches.
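If rsyncd does turn out to be the better approach, my reading of the docs (untested, so treat the names and values as my guesses) is that the per-host config would look something like:

    # assumed rsyncd setup -- the module name 'home' is just an example
    $Conf{XferMethod}     = 'rsyncd';
    $Conf{RsyncShareName} = ['home'];
    $Conf{RsyncdUserName} = 'backuppc';
    $Conf{RsyncdPasswd}   = 'secret';

with a matching module in /etc/rsyncd.conf on the server:

    [home]
        path = /home
        read only = yes
        auth users = backuppc
        secrets file = /etc/rsyncd.secrets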

Thanks for any help.
Brendan.




