Re: [BackupPC-users] Problem with large files - update

2006-01-27 Thread mna.news
On Sunday, 22 January 2006 at 23:43, Craig Barratt wrote:
 [...]
 Following up on the list for archive purposes.

 We debugged this offline. The problem is specific to files of 8GB or
 larger, backed up with smbclient or tar, whose size is a multiple of
 256 bytes.  (Unfortunately my 8GB test cases were not multiples of
 256 in size.)

 The following patch fixes the problem.  I'll roll this into
 the next official patch.

 Craig

 --- BackupPC_tarExtract   2005-12-11 15:52:59.0 -0800
 +++ BackupPC_tarExtract.new   2006-01-12 11:57:58.859704000 -0800
 @@ -102,7 +102,7 @@
  # Copyright 1998 Stephen Zander. All rights reserved.
  #
  my $tar_unpack_header
 -    = 'Z100 A8 A8 A8 A12 A12 A8 A1 Z100 A6 A2 Z32 Z32 A8 A8 A155 x12';
 +    = 'Z100 A8 A8 A8 a12 A12 A8 A1 Z100 A6 A2 Z32 Z32 A8 A8 A155 x12';
  my $tar_header_length = 512;

  my $BufSize  = 1048576; # 1MB or 2^20


I can confirm that your patch resolves a problem I have seen over the last
few days on AIX using the tar method.

AIX 5 + tar (GNU tar) 1.15.1
17809556 -rw-r--r--   1 toto  gtoto 18236979200 23 jan 23:15 Backupatoto.tar

This 17 GB file (a multiple of 256 bytes in size) is backed up perfectly using your patch.
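
As a quick check (a one-liner of my own, not part of the original report), the
size shown by ls above is indeed a multiple of 256 bytes and therefore in the
affected class:

    perl -e 'print 18236979200 % 256, "\n"'    # prints 0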

thanks.
Michael.

-- 
Three unbearable things:
scalding coffee, lukewarm champagne, and cold women.
Orson Welles




Re: [BackupPC-users] Problem with large files - update

2006-01-22 Thread Craig Barratt
Marko Tukiainen writes:

 I tried sending this message to the list yesterday, but it seems that I
 wasn't subscribed with this exact address, so I'm sorry if this is a double
 post. Anyway, I've added some further information to this question.
 
 I'm having some problems backing up large files (8GB). By some problems 
 I mean that backing up such a file results in a file of size 3904849733 
 bytes and over 100 directories with incomprehensible, random names.
 
 Here's some background:
 
 The BackupPC machine is a 64-bit AMD box, running with a 64-bit version 
 of Ubuntu 5.10 / Breezy. The backup disk is a 1.2 TB RAID5 array, with 
 ReiserFS as the filesystem for BackupPC's data partition.
 
 I'm trying to back up the particular file from a 750 GB TeraStation NAS
 using the smbclient transport. I also tried this with the tar transport
 combined with an smbfs mount, with exactly the same results.
 
 After having quite exhaustively surfed the documentation and forums,
 I've determined that BackupPC _should_ support files over 8GB in size,
 as should recent versions of smbclient and tar. I'm also reasonably sure
 that they do, because I've tried using and copying the file from the NAS 
 to the BackupPC server (onto the BackupPC data partition) with various 
 methods, none of which have failed. These include using smbclient, 
 mount.smbfs and mount.cifs. All of them produced a nice, fully intact 
 10-gigabyte file.
 
 I even tried the exact same command that BackupPC seems to use in the 
 backup process, and it produced no errors. The command I used was:
 
 /usr/bin/env LC_ALL=C /bin/tar -c -v -f - -C /test --totals . > test.tar
 
 (I did this when the remote file system was mounted on /test as smbfs.)
 
 The versions of the software installed on the BackupPC server are:
 backuppc 2.1.1-2ubuntu3
 smbclient 3.0.14a-6ubuntu1
 tar 1.15.1-2
 
 The TeraStation uses Samba as well, although an ancient version
 (2.2.8a). Nevertheless, it seems to support storing these large files,
 since they can be used just fine when BackupPC is not involved.
 
 So, any suggestions? I'd hate to change my backup strategy for this, 
 since I've been using backuppc for quite some time now with only a few 
 problems.
 
 UPDATE: Seems that the TeraStation has no impact on the result... I 
 tried backing up the 8GB file from a standard WinXP host using 
 smbclient, and the result was exactly the same: a bunch of mangled 
 directories and a smaller-than-expected file.
 
 However, when BackupPC ran an incremental backup last night, the new
 backup (with index 1) displays the correct directory structure, without
 any extra dirs, and the large file, although it displays its size as
 8589934592 bytes. There is no such file in the file system though; or,
 more precisely, it's still 3904849733 bytes long and in the original
 directory (index 0).

Following up on the list for archive purposes.

We debugged this offline. The problem is specific to files of 8GB or
larger, backed up with smbclient or tar, whose size is a multiple of
256 bytes.  (Unfortunately my 8GB test cases were not multiples of
256 in size.)

The following patch fixes the problem.  I'll roll this into
the next official patch.

Craig

--- BackupPC_tarExtract 2005-12-11 15:52:59.0 -0800
+++ BackupPC_tarExtract.new 2006-01-12 11:57:58.859704000 -0800
@@ -102,7 +102,7 @@
 # Copyright 1998 Stephen Zander. All rights reserved.
 #
 my $tar_unpack_header
-    = 'Z100 A8 A8 A8 A12 A12 A8 A1 Z100 A6 A2 Z32 Z32 A8 A8 A155 x12';
+    = 'Z100 A8 A8 A8 a12 A12 A8 A1 Z100 A6 A2 Z32 Z32 A8 A8 A155 x12';
 my $tar_header_length = 512;
 
 my $BufSize  = 1048576; # 1MB or 2^20
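
To make the one-character template change above a little more concrete: GNU tar
stores sizes that do not fit in the 11-digit octal field (8GB and up) in
base-256 form, a flag byte with the high bit set followed by the size in
big-endian binary. When the size is a multiple of 256, that field ends in one
or more 0x00 bytes, which Perl's 'A12' template silently strips and 'a12'
preserves. The short standalone sketch below (illustrative only, not BackupPC
code; it assumes a Perl with 64-bit integers) shows the effect using the
8589934592-byte size mentioned earlier in this thread:

    #!/usr/bin/perl
    # Illustrative sketch: why 'A12' vs 'a12' matters for GNU tar's
    # base-256 size field.  'A' strips trailing NUL bytes, 'a' keeps them.
    use strict;
    use warnings;

    my $size = 8589934592;                    # the 8GB file from this thread, 2^33

    # Build the 12-byte size field the way GNU tar writes it for large files:
    # 0x80 flag byte, then the size in big-endian binary.
    my $field = chr(0x80) . ("\0" x 3);
    $field .= chr(($size >> (8 * $_)) & 0xff) for reverse 0 .. 7;

    # Decode a base-256 size field (skip the flag byte, accumulate big-endian).
    sub decode_base256 {
        my ($s) = @_;
        my $v = 0;
        $v = ($v << 8) | $_ for map { ord } split //, substr($s, 1);
        return $v;
    }

    my ($kept)     = unpack('a12', $field);   # all 12 bytes preserved
    my ($stripped) = unpack('A12', $field);   # trailing 0x00 bytes dropped

    printf "a12: %2d bytes -> %d\n", length($kept),     decode_base256($kept);
    printf "A12: %2d bytes -> %d\n", length($stripped), decode_base256($stripped);
    # a12: 12 bytes -> 8589934592
    # A12:  8 bytes -> 2    (every stripped byte divides the size by 256)

With the size decoded that badly wrong, BackupPC_tarExtract loses its place in
the archive stream, which is consistent with the garbage directories and
wrong-sized file described in the original report.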


___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/