Use zcat, or you can gunzip it first.
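(A quick illustration of the two approaches, using a plain gzip file created on the spot. Note that BackupPC writes its logs with its own internal compression, so on many installs the bundled BackupPC_zcat tool, whose path varies by distro, is what actually works on XferLOG.z; the demo below only shows the generic zcat-vs-gunzip distinction.)

```shell
# Illustrative only: make a small gzip'd log, then read it two ways.
printf 'started full backup\nbackup complete\n' > demo.log
gzip demo.log                      # writes demo.log.gz, removes demo.log

zcat demo.log.gz                   # stream the contents without unpacking
gunzip -c demo.log.gz > demo.log   # or decompress to a copy, keeping the .gz
cat demo.log
```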
Regards,
Mike
On Sep 13, 2010, at 1:20 AM, IvyAlice wrote:
> Hello everybody,
>
> I'm having some problems with BackupPC and I want to read the XferLOG.z file.
> I can't read it with either vim or nano, and I can't extract the file with
> uncompress, gunzip, zcat or tar (it t
I hate to bring this up again, but after taking the advice from Les and John,
I'm not seeing what I think I should be seeing. After changing my
current config to the one below, I started to get incr, incr, full,
incr, incr, full, but the fulls were processing the entire 600G.
Here's what I have for my host:
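(Not the poster's actual settings; the fragment below is illustrative only. With the rsync transfer method a full backup re-reads and checksums every file on the client, so it "does" the whole 600G even though only changed blocks cross the wire. An incr, incr, full cycle usually comes from period settings along these lines in config.pl:)

```perl
# Illustrative config.pl fragment -- example values, not the poster's.
$Conf{FullPeriod}  = 2.97;  # a full roughly every 3 days -> incr, incr, full
$Conf{IncrPeriod}  = 0.97;  # an incremental roughly every day
$Conf{FullKeepCnt} = 4;     # how many full backups to retain
```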
On Fri, Feb 19, 2010 at 10:11 AM, John Moorhouse
wrote:
> I'm happily using BackupPC to back up a number of machines within our home
> network. I'm wondering what will happen if I use it to back up the file on the
> host machine that is the virtual disc for a number of VirtualBox VMs; will it
> ha
On Fri, Feb 19, 2010 at 9:53 AM, Timothy J Massey wrote:
> Mike Bydalek wrote on 02/19/2010
> 11:28:25 AM:
>
>> Thanks for all the input. I'm starting to fully understand how
>> BackupPC scheduling is working now. My apologies for not stating that
>> I was/
All,
Thanks for all the input. I'm starting to fully understand how
BackupPC scheduling works now. My apologies for not stating that
I was/am using rsync, as it is the only choice that makes sense =)
On Fri, Feb 19, 2010 at 12:15 AM, Craig Barratt
wrote:
> Mike,
>
>> Backup# Type
On Thu, Feb 18, 2010 at 12:04 PM, John Rouillard
wrote:
> On Thu, Feb 18, 2010 at 07:51:13AM -0700, Mike Bydalek wrote:
>> My question is, why did backups 13 and 14 back up all that data? Same
>> with 2 and 7, for that matter.
>
> What level are your incremental backups? If ba
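(Context for the level question, with illustrative values not taken from this thread: an incremental's level determines its reference backup. A level-1 incremental is taken against the last full, a level-2 against the last level-1, and so on, so a level-1-only schedule re-copies everything changed since the full each time. In config.pl this is controlled by $Conf{IncrLevels}:)

```perl
# Illustrative config.pl fragment.
# Default: every incremental is level 1, i.e. relative to the last full.
$Conf{IncrLevels} = [1];

# Multi-level alternative: each incremental is relative to the previous
# one, so later incrementals in the cycle transfer and store less.
# $Conf{IncrLevels} = [1, 2, 3, 4, 5, 6];
```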
Hello.
Recently I've started using BackupPC to back up my file server, and I'm
seeing some things that just don't quite make sense. Lately
backups have been taking quite some time; in fact, the current one
started on 2/16 at 11pm and is still running. I do have a lot of data,
around 330G, but no