On Thu, 6 Dec 2007, dan wrote:
> i have been experimenting with this and think i have a workable
> solution. only problem is that nexenta is a bit of a pain to get
> backuppc running on. just seems to have a lot of little issues with
> backuppc 3.1 (running nexenta a7)
nexenta is Debian/Ubuntu-
i have been experimenting with this and think i have a workable solution.
only problem is that nexenta is a bit of a pain to get backuppc running on.
just seems to have a lot of little issues with backuppc 3.1 (running nexenta
a7)
i just downloaded solaris express to see if i have better luck with
dan wrote:
> i understand that unison uses the same algorithm but does it have the same
> memory issues when transferring a lot of files?
There is a small issue with the number of files in that the whole
directory listing is transferred and held in memory during the
comparison, but the real problem
On Thu, 6 Dec 2007, Carl Wilhelm Soderstrom wrote:
so the only way I can see replication helping, is if you don't have time
to do multiple backups in a given cycle. this is certainly the case in
many situations; but if it's not, then it may be advisable to not
bother trying to build a replica
Hello Rich,
thanks for the tip. That was it. I just assumed that it was installed
and didn't read the errors carefully enough.
I appreciate your help.
-Matthew
Rich Rauenzahn wrote:
>
> Matthew Metzger wrote:
>> /bin/sh: cc: command not found
>>
> Install a compiler/gcc...?
>> make[1]: *
Matthew Metzger wrote:
> /bin/sh: cc: command not found
>
Install a compiler/gcc...?
> make[1]: *** [Digest.o] Error 127
> make[1]: Leaving directory `/home/sysadmin/File-RsyncP-0.68/Digest'
> make: *** [subdirs] Error 2
> ---
>
> I get the same type of errors when trying to install
On 12/06 04:49 , dan wrote:
> alternatively i think i may just break rsync up into smaller jobs and sync
> each specific directory separately with a 'for x in `ls /`; do rsync $x
> rsyncd://serveraddress/; done' script. that would help limit the number of files
> in each rsync iteration and work around
i understand that unison uses the same algorithm but does it have the same
memory issues when transferring a lot of files?
alternatively i think i may just break rsync up into smaller jobs and sync
each specific directory separately with a 'for x in `ls /`; do rsync $x
rsyncd://serveraddress/; done' script
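A runnable sketch of dan's per-directory split, assuming an rsyncd module named "backup" on "serveraddress" (both placeholders, not from a real server). The one-liner's nested single quotes would break in a real shell, so the sketch uses a function and passes "echo" to print the commands instead of contacting a daemon:

```shell
#!/bin/sh
# One rsync job per top-level directory keeps each file list small,
# which limits rsync's per-run memory use.  "serveraddress" and the
# "backup" module are placeholders; pass "echo" as $2 to dry-run.
sync_per_dir() {
    src=$1
    run=${2:-}
    for x in "$src"/*; do
        [ -d "$x" ] || continue
        $run rsync -a "$x/" "rsync://serveraddress/backup/$(basename "$x")/"
    done
}

# Demo tree standing in for "/" in the original one-liner.
demo=$(mktemp -d)
mkdir -p "$demo/home" "$demo/var"
sync_per_dir "$demo" echo
rm -rf "$demo"
```

Dropping the "echo" argument would run the real transfers, one directory at a time.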
Hello,
I had previously installed this perl module with "sudo aptitude install
libfile-rsyncp-perl" (I'm on Ubuntu Server 6.06). I got this warning
upon the completion of the BackupPC installer script configure.pl:
Warning: you need to upgrade File::RsyncP; I found 0.52 and BackupPC
needs 0.68
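The fix in this thread was building File::RsyncP 0.68 from the CPAN tarball, which needs a C compiler (the "cc: command not found" / Error 127 upthread means gcc is missing). A quick hedged check of which version perl actually picks up after the upgrade, assuming perl is on the PATH:

```shell
#!/bin/sh
# Print the File::RsyncP version perl finds (BackupPC 3.x wants 0.68+);
# falls back to a message when the module isn't installed at all.
# To build past the distro package, from the unpacked CPAN tarball:
#   perl Makefile.PL && make && make test && sudo make install
ver=$(perl -MFile::RsyncP -e 'print $File::RsyncP::VERSION' 2>/dev/null) \
    || ver="not installed"
echo "File::RsyncP: $ver"
```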
Jon Forrest wrote:
> Craig Barratt wrote:
>
> First of all, thanks so much for replying. Please take my comments
> as respectful suggestions.
>
>> Your BackupFilesOnly is backwards. It should map the share name
>> to the list of files/directories to backup for that share:
>>
>> $Conf{BackupF
Craig Barratt wrote:
First of all, thanks so much for replying. Please take my comments
as respectful suggestions.
> Your BackupFilesOnly is backwards. It should map the share name
> to the list of files/directories to backup for that share:
>
> $Conf{BackupFilesOnly} = {
> '/' => [
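Craig's snippet is cut off by the archive; a hedged completion of the shape he describes, with the share name as the key and the list of files/directories as the value. The paths inside the list are illustrative only, not from the thread:

```perl
# Share name maps to the list of files/directories to back up
# within that share (paths here are examples, not from the thread):
$Conf{BackupFilesOnly} = {
    '/' => [
        '/home',
        '/etc',
    ],
};
```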
On Thu, 6 Dec 2007 07:53:00 -0700,
dan <[EMAIL PROTECTED]> wrote:
> i am currently cloning my backuppc server to a remote server in
> another city via rsync. as i understand it, as my file count grows
> rsync will become unfriendly.
>
> i am completely unfamiliar with 'unison' and am wondering
i am currently cloning my backuppc server to a remote server in another city
via rsync. as i understand it, as my file count grows rsync will become
unfriendly.
i am completely unfamiliar with 'unison' and am wondering if anyone knows if
unison has the same issue as rsync with many many files and
Hi all:
I am using BackupPC-3.1.0 and I want to back up a share using the
following exclusion patterns under rsync:
+ */
+ */Maildir/**
- *
Basically I want to include any directory tree starting with the
directory Maildir one level below the root of the transfer (specified
as /home), and p
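With these rules, ordering matters: the two includes must precede the final "- *" catch-all, or rsync excludes everything before the includes are consulted. A minimal sketch that writes the rules to a filter file and prints the rsync invocation (host and paths are placeholders, not from the thread):

```shell
#!/bin/sh
# "+ */" admits every directory, "+ */Maildir/**" admits everything
# under any Maildir one level down, and "- *" drops the rest.
rules=$(mktemp)
cat > "$rules" <<'EOF'
+ */
+ */Maildir/**
- *
EOF
# Placeholders: "host" and the destination path are illustrative.
echo rsync -av --prune-empty-dirs --filter=". $rules" host:/home/ /backup/home/
```

Note that "+ */" alone recreates every directory (empty) on the receiver; --prune-empty-dirs (rsync 2.6.7+) suppresses the empty skeleton so only trees that actually contain Maildir content are created.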