Re: [BackupPC-users] Backup Data Volumes

2010-06-14 Thread Norbert Hoeller
Below are the rsync options; I do not recall changing them from the
defaults.

rsync --server --sender --numeric-ids --perms --owner --group -D --links 
--hard-links --times --block-size=2048 --recursive . /var//
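
For comparison, those switches line up with the stock $Conf{RsyncArgs}
shipped in BackupPC 3.x's config.pl (a sketch of the distributed
defaults; check your own config.pl to confirm):

    # BackupPC 3.x default rsync arguments; --server and --sender are
    # added by the File::RsyncP transport, not by this list.
    $Conf{RsyncArgs} = [
        '--numeric-ids', '--perms', '--owner', '--group', '-D',
        '--links', '--hard-links', '--times',
        '--block-size=2048', '--recursive',
    ];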

Aside from backing up a symbolic link rather than the full (and rather
long) directory path I used on the old web server, the other difference
is the backup server architecture: the new server is a 'plug computer'
with an ARM processor running Ubuntu 9.04.
Thanks, Norbert



Re: [BackupPC-users] Backup Data Volumes

2010-06-14 Thread Norbert Hoeller
I discontinued backups of my old web server this weekend and upgraded
rsync on the new web server to 3.0.5 to be compatible with the BackupPC
server.
This morning, backup traffic was close to 450MB.  I did one full backup 
(existing files 1492/14MB, new files 12/0MB) and three incrementals 
(existing files 3826/411MB, new files 778/27MB). 

The traffic pattern suggests that one of the incremental backups (existing 
files 3734/411MB, new files 664/21MB) accounted for the bulk of the 
traffic.  I had migrated multiple MediaWiki instances over the weekend, 
all using an identical code base.  One MediaWiki instance had been backed 
up last week.  Although the file counts and aggregate data are
considerable, I would have expected rsync to detect that the files were
already stored on the BackupPC server and to skip transferring them.
The data volumes suggest otherwise.
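
One way to check whether pooling at least deduplicated the files on
disk after the transfer (a sketch; the TopDir, host name, backup
number, and mangled path below are hypothetical):

    # A pooled file has a hard-link count > 1: one link under pc/,
    # plus one under cpool/ (or pool/ for uncompressed backups).
    f=/var/lib/backuppc/pc/newweb/123/f%2fvar/fwiki/fLocalSettings.php
    stat -c '%h hard links' "$f"
    # Locate its twin in the compressed pool:
    find /var/lib/backuppc/cpool -samefile "$f" -print -quit

If I understand the rsync transport correctly, pooling saves disk
space rather than network traffic: rsync only computes deltas against
a previous copy of the same path on the same host, so files at paths
new to a host are sent in full and linked into the pool afterwards.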

Am I missing something obvious?
Thanks, Norbert



[BackupPC-users] rsync restore skipping non-regular file ...

2010-06-14 Thread Ralf Gross
Hi,

I switched from tar to rsync a few weeks ago. Now I had to restore the
first file from an older backup (made before the tar -> rsync switch).

I get the following messages in the xfer log:

Running: /usr/bin/ssh -c blowfish -q -x -l root vu0em003-1
/usr/bin/rsync --server --numeric-ids --perms --owner --group -D
--links --hard-links --times --block-size=2048 --relative
--ignore-times --recursive --checksum-seed=32761 . /tmp/
Xfer PIDs are now 27459
Got remote protocol 30
Negotiated protocol version 28
Checksum caching enabled (checksumSeed = 32761)
Got checksumSeed 0x7ff9
Sending /server/projekte/path/to/file...  file.xls (remote=/file.xls) type = 0
  restore   770 50872/1095 1789952 /tmp/file.xls
Remote[2]: skipping non-regular file "file.xls"
Finished csumReceive
Finished csumReceive
Done: 1 files, 1789952 bytes


BackupPC shows a status of success for the restore, but no file was
restored to /tmp.

I can successfully download the file as a zip archive or by just
clicking on the file in the tree view.

I can also restore the same file with rsync when it is in a more
recent backup, made with rsync after the switch from tar.

Is this a known problem? Is there anything I can do to restore these
older files with rsync (besides switching to tar as the restore method)?
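
For what it's worth, a command-line fallback that bypasses the rsync
restore path entirely is BackupPC_tarCreate (a sketch; the dump number,
share name, file path, and install path are placeholders to adjust):

    # Extract one file from backup #42 of host vu0em003-1 into /tmp,
    # running as the backuppc user.
    /usr/share/backuppc/bin/BackupPC_tarCreate \
        -h vu0em003-1 -n 42 -s /server/projekte /path/to/file.xls \
        | tar -xvf - -C /tmp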

Ralf



[BackupPC-users] Retrieve the proper incremental level

2010-06-14 Thread Inno
Hello,

I use incremental levels (1, 2, 3, 4, corresponding to Monday, Tuesday,
Wednesday, and Thursday). Last week the Wednesday and Thursday backups
failed, so the schedule stopped at level 2.
Do I need to rerun two incrementals to get back to the proper level?
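
For reference, that schedule corresponds to something like the
following in BackupPC 3.x's config.pl (a sketch; values illustrative):

    $Conf{IncrPeriod} = 0.97;           # roughly daily incrementals
    $Conf{IncrLevels} = [1, 2, 3, 4];   # Mon, Tue, Wed, Thu

As far as I understand the 3.x behavior, the next level is chosen by
counting the incrementals completed since the last full, so the
sequence should resume on its own after the missed days.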

Thanks.





Re: [BackupPC-users] BackupPC can't find Compress::Zlib after recent update on Centos 5

2010-06-14 Thread Pete Geenhuizen
It turns out it's a problem with List::Util.  The perl-Scalar-List-Utils
rpm conflicts with the perl-5.8.8-32.el5_5.1.i386 rpm.  Upgrading
List::Util from 1.19 to 1.23 via CPAN solved the problem.
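
To confirm which List::Util Perl actually loads (version numbers per
the above; paths will vary):

    perl -MList::Util -e 'print $List::Util::VERSION, "\n"'
    perl -MList::Util -e 'print $INC{"List/Util.pm"}, "\n"'
    # Upgrade via CPAN if it still reports 1.19:
    perl -MCPAN -e 'install List::Util'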


Pete Geenhuizen wrote:
> Oops, I missed the fact that both BackupPC and Perl were updated, so I 
> suppose the issue is with Perl and the Zlib rpms, not BackupPC. Any 
> clues as to how to go about resolving this?
>
> Pete Geenhuizen wrote:
>   
>> This morning I got an update for BackupPC to BackupPC-3.1.0-6.el5, and 
>> now it complains that it can't find Compress::Zlib.
>>
>> I've experienced this problem in the past and was able to find the 
>> necessary fix, i.e. use rpms rather than CPAN modules, which I did and 
>> the problem was solved, so I'm a bit surprised that this has cropped up 
>> again and don't know what to do next.
>>
>> I have the following RPMs installed
>> perl-IO-Compress-2.024-1.el5.rf
>> perl-Compress-Raw-Zlib-2.024-1.el5.rf
>> perl-Compress-Raw-Bzip2-2.024-1.el5.rf
>>
>> Not sure what other information to provide at this point.
>>
>> Pete
>>

-- 
Unencumbered by the thought process.  
 -- Click and Clack the Tappet brothers 




Re: [BackupPC-users] BackupPC can't find Compress::Zlib after recent update on Centos 5

2010-06-14 Thread Jean-Michel Beuken
Hello,

This is my working config with BackupPC 3.1.0:

[backu...@backup10 html]$ cat /etc/redhat-release
CentOS release 5.5 (Final)

[backu...@backup10 html]$ rpm -qa | grep -i zlib
zlib-devel-1.2.3-3
zlib-1.2.3-3
zlib-devel-1.2.3-3
perl-IO-Zlib-1.04-4.2.1
perl-Compress-Zlib-1.42-1.fc6
zlib-1.2.3-3
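
A quick sanity check that the module loads for the BackupPC user (a
one-liner sketch):

    perl -MCompress::Zlib -e 'print $Compress::Zlib::VERSION, "\n"'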


regards

jmb

On 14/06/2010 01:49, Pete Geenhuizen wrote:
> This morning I got an update for BackupPC to BackupPC-3.1.0-6.el5, and
> now it complains that it can't find Compress::Zlib.
>
> I've experienced this problem in the past and was able to find the
> necessary fix, i.e. use rpms rather than CPAN modules, which I did and
> the problem was solved, so I'm a bit surprised that this has cropped up
> again and don't know what to do next.
>
> I have the following RPMs installed
> perl-IO-Compress-2.024-1.el5.rf
> perl-Compress-Raw-Zlib-2.024-1.el5.rf
> perl-Compress-Raw-Bzip2-2.024-1.el5.rf
>
> Not sure what other information to provide at this point.
>
> Pete
>
>


