OK, I don't know what I did, but it just started working...
-Original Message-
From: Chris Adamson [mailto:chris.adam...@mcri.edu.au]
Sent: Tuesday, 5 July 2016 11:12 AM
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] rrd images not displaying in web interface
In addition, I put a print `/usr/bin/whoami` into the GeneralInfo.pm file, and
in both installations it printed backuppc. But /usr/bin/rrdtool is run as
www-data in the bad installation. I don't get it.
-Original Message-
From: Chris Adamson [mailto:chris.adam...@mcri.edu.au]
I tried this and it didn't work.
-Original Message-
From: karlis [mailto:backuppc-fo...@backupcentral.com]
Sent: Monday, 4 July 2016 8:49 PM
To: backuppc-users@lists.sourceforge.net
Subject: [BackupPC-users] rrd images not displaying in web interface backuppc
3.3.1 + Apache webserver
The other fields are identical apart from IP addresses.
Any ideas?
Chris.
Chris Adamson
Senior Research Officer
Developmental Imaging
Murdoch Childrens Research Institute
The Royal Children's Hospital
Flemington Rd Parkville, Victoria 3052 AUS
T: (03) 993
I'm assuming that if compression is used then the program does not need to know
the exact level. So why don't you just try to read it as a compressed file; if
that fails, try to read it as an uncompressed file; and if that also fails,
throw an error. That way you don't need to read any configuration.
The idea of using --whole-file is that rsync behaves like tar: I thought I
could set --whole-file so that the rsync method would behave like the tar
method and compare byte by byte rather than using checksums. I had a look
both in the rsyncp and BackupPC code (RsyncFileIO.pm). The use of
e.
-Original Message-
From: Les Mikesell [mailto:lesmikes...@gmail.com]
Sent: Thursday, 26 September 2013 2:01 PM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] RsyncP and --whole-file
On Tue, Sep 24, 2013 at 6:58 PM, Chris Adamson
wrote:
Setting the global compression level to 0 fixed this issue. This line needs to
read the host-specific compression level:
my $attrib = BackupPC::Attrib->new({ compress => $Conf{CompressLevel} });
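In other words, the attrib reader should use the compression level resolved from the merged per-host configuration rather than the global default. A hypothetical Python sketch of that resolution (the function and dict names are illustrative, not BackupPC's API):

```python
def effective_conf(global_conf, host_conf):
    """Per-host settings override global ones; anything the host does
    not set falls through to the global value."""
    merged = dict(global_conf)
    merged.update(host_conf)
    return merged

global_conf = {"CompressLevel": 3, "XferMethod": "rsyncd"}
host_conf = {"CompressLevel": 0}  # this host's pool files are uncompressed
conf = effective_conf(global_conf, host_conf)
# Attrib files for this host must be read with conf["CompressLevel"],
# not with the global level of 3.
```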
deal in 0.70.
-Original Message-
From: Les Mikesell [mailto:lesmikes...@gmail.com]
Sent: Wednesday, 25 September 2013 4:03 AM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] RsyncP and --whole-file
On Mon, Sep 23, 2013 at 8:51 PM, Chris Adamson wrote:
d of rsync?
On Tue, Sep 24, 2013 at 3:51 AM, Chris Adamson
mailto:chris.adam...@mcri.edu.au>> wrote:
I'm using backuppc to back up > 14TB of data on a local machine. I'm using the
rsyncd option since tar does not detect deleted files in incrementals. I was
trying to see if t
Tue, Sep 24, 2013 at 2:27 PM, Chris Adamson
mailto:chris.adam...@mcri.edu.au>> wrote:
As a workaround for the tar method not detecting deletions on localhost
backups, would it be possible to write a script that is run after each
incremental to perform a file listing on the source directory and mark any
files that are no longer in the source directory? I realize this is a
workaround and would only be appropriate for localhost backups, but it wouldn't
be a bad solution: incrementals are quick, they run when no one is using the
system, and it would make the full backups much faster than rsyncd.
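The listing-and-compare step of that workaround could be sketched like this (a hypothetical Python sketch; a real script would also have to record the deletions somewhere BackupPC can see them, which this does not attempt):

```python
import os

def list_files(root):
    """Return the set of file paths under root, relative to root."""
    found = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            found.add(os.path.relpath(os.path.join(dirpath, name), root))
    return found

def deleted_since(previous_listing, root):
    """Paths present in a listing taken before the incremental but no
    longer present under root -- candidates to mark as deleted."""
    return previous_listing - list_files(root)
```

Run list_files on the share before each incremental, save the result, and after the next run feed it to deleted_since to get the paths that disappeared in between.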
since it is not transferring over a network. I had a look in the RsyncP code
and it ignores the --whole-file option. Could this be implemented in RsyncP,
since the backup rsync processes are very CPU bound and this slows down
backups, particularly full backups?
Chris Adamson.
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] cpool corruption not fixed by subsequent backups
On 21/08/13 16:16, Chris Adamson wrote:
I have recently had some weird hardware or software errors on my
backuppc filesystem. I have fixed the filesystem errors but some of my
cpool files have been corrupted. Firstly, during a restore I was getting
Error: padding messages like this one:
Error: padding to 33267797 bytes from 3250
List,
I implemented the tar script option for backing up my localhost as per:
http://backuppc.sourceforge.net/faq/localhost.html
so the "if you are more cautious" option, where you create a script that
runs "tar -c $*".
When an incremental is run I get the following warning message in the
log
Everyone,
I am running a backuppc installation with a pool of around 3.5TB. The
biggest share in this pool is around 2TB and is located on the machine
itself, so I use the tar method on localhost to back it up. When I do
full backups of this share it takes around 4 days to complete and it
Everyone,
I have been using the backuppc 3.0 ubuntu package in hardy for a while
and wish to upgrade to jaunty and use the newer 3.1 version. Are there
any potential problems with just upgrading the distro, installing the
new package and pointing it to the existing directories?
Thanks in advance.