Hello,
A strange situation and strange behaviour:
the backup of the PC worked for a long time...
but then the hard disk crashed...
for one week no backups were taken...
I replaced the drive and put the PC back on the network, but I forgot to
disable backups :-(
BackupPC took a full backup (36) with 0
Unless you have a specific reason not to, absolutely! I would use
multi-level incrementals. I've always liked a tower of hanoi backup scheme
(http://www.google.com/search?hl=en&q=%22tower+of+hanoi%22+backup+rotation),
but there are others that work well too.
That said, it's still in your best i
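As a concrete sketch, assuming weekly fulls (this is only one possible way
to map a Hanoi rotation onto BackupPC's incremental levels):
# Each level-N incremental is taken relative to the most recent backup of
# a lower level, so deltas stay small while history spreads out.
$Conf{FullPeriod} = 6.97;
$Conf{IncrLevels} = [4, 3, 4, 2, 4, 3, 4];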
It is BackupPC 3.0, recently upgraded from 2.1.2.
I have only $Conf{IncrLevels} = [1];
Are you suggesting I implement multi-levels?
Tony
On Jun 21, 2007, at 10:34 AM, Stephen Joyce wrote:
> If you're using 2.1.x, then yes. All incrementals are based off of
> the full and will check/transfe
I'm experiencing something and I just want to check if I'm
understanding what is happening.
I back up Linux-to-Linux with XferMethod = rsync. Occasionally a full
backup of one of my large systems fails for one reason or another, so
I exclude some big directories and run it again, and then have
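For reference, excluding directories for an rsync host can be done
per-share; the share name and paths below are only placeholders:
# key = share name, value = list of paths to skip
$Conf{BackupFilesExclude} = {
    '/' => ['/var/cache', '/data/scratch'],
};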
Hi
I am using a Linux machine to back up 76 Windows computers and 3 Linux
servers. All backups are via rsync - Cygwin on the Windows machines.
Incremental backups are working fine, but full backups of the Windows
machines contain only files and directories one level deep and nothing
else (i.e. all second
A big file can't be backed up by BackupPC.
I have a big file (15707164 bytes). When BackupPC tries to transfer it to
the server, BackupPC fails with this message:
Error reading file \Mes documents\frontispiece.tif : Call timed out: server
did not respond after 20 milliseconds
Didn't get ent
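To narrow down whether the timeout is in the SMB layer rather than in
BackupPC itself, one test (the host, share, and user names here are made
up) is to fetch the file directly with smbclient:
smbclient //client-pc/docs -U backupuser -c 'get "Mes documents\frontispiece.tif" /tmp/frontispiece.tif'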
Guy Malacrida wrote:
>
> To be honest I am still in doubt, for you are offering a BackupPC_tarCreate
> command and I fail to understand how you could split the output into, say,
> tranches of 4.7 GB to then store on DVDs.
If you have temporary disk space to store the result, you can pipe
the tar out
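One sketch of the idea, streaming straight into split (the host name and
output prefix are placeholders; 4400m stays safely under a single-layer
DVD's ~4.37 GiB capacity):
BackupPC_tarCreate -h somehost -n -1 -s \* . | gzip -c | split -b 4400m - /var/tmp/somehost.tar.gz.
Each piece goes on its own DVD; to restore, concatenate them back together
with cat and feed the result to tar.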
Thanks Daniel for your help.
To be honest I am still in doubt, for you are offering a BackupPC_tarCreate
command and I fail to understand how you could split the output into, say,
tranches of 4.7 GB to then store on DVDs.
Meanwhile I read the documentation again. Anyway, I tried to use
BackupFile
Hi,
In the same vein you can use (several) dd processes to create a buffer;
the two dd stages reblock the stream into 512 KB blocks and help decouple
gzip from the tape drive:
BackupPC_tarCreate -h pontiac -n -1 -s \* . | gzip -c | dd obs=512k | dd obs=512k of=/dev/nst0
This needs to be tested as well.
Dan Smisko wrote:
> I would try the "buffer" util. It will reblock and buffer the
> input data. The default
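Along the same lines, with the buffer(1) utility Dan mentions (the block
size, memory figure, and tape device are just example values):
BackupPC_tarCreate -h pontiac -n -1 -s \* . | gzip -c | buffer -s 512k -m 16m -o /dev/nst0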
Hi all,
BackupPC_archive -h yields the following:
usage: ./BackupPC_archive
Having a few cracks at it yields:
./BackupPC_archive: bad reqFileName (arg #3):
/tina/backuppc/DATA/pc/elephant/0/Users/test/
Does anyone care to provide an example of how to set off a
BackupPC_archive of a directory
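Not BackupPC_archive itself, but for one-off command-line archives the
BackupPC_archiveHost wrapper is usually simpler. A sketch, assuming the
argument order from the default $Conf{ArchiveClientCmd} (the install paths
are guesses; run it as the backuppc user):
/usr/local/BackupPC/bin/BackupPC_archiveHost \
    /usr/local/BackupPC/bin/BackupPC_tarCreate /usr/bin/split /usr/bin/par2 \
    elephant -1 /bin/gzip .gz 0000000 /tmp/archive 0 '*'
Here -1 means the latest backup, 0000000 disables splitting, and the final
0 and '*' are the par-file setting and the share name.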
Hi all,
It is my interpretation that BackupPC will 'fire up' at the configured
wake intervals to opportunistically do backups outside of blackout
periods defined globally or overridden per-host. If I am wrong on this
then someone please bail me up right now to correct my understanding
before I
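For reference, the relevant settings look like this (values here are only
illustrative):
$Conf{WakeupSchedule} = [1..23];    # hours at which BackupPC wakes up
$Conf{BlackoutPeriods} = [
    { hourBegin => 8.0, hourEnd => 18.0, weekDays => [1..5] },  # Mon-Fri 8am-6pm
];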