Patrick Michael Kane wrote:

If you restore a dump file to disk someplace and run "file" on it,
what type of file does it tell you it is?


Do you mean a "normal" amrestore'd file, or a "raw" recovery?

Actually, I have examples of both:

#  file fileserv._scanner2_Hoyde.20041008.6
fileserv._scanner2_Hoyde.20041008.6: GNU tar archive

# file fileserv._scanner2_Hoyde.20041006.0
fileserv._scanner2_Hoyde.20041006.0: AMANDA dump file, DATE 20041006 fileserv /scanner2/Hoy


But of course the output would be what you expected for valid dump files, since they are *mostly* OK. Like I said earlier, a tar extract (or list) on the files starts off fine, and if I look at the (uncompressed) files starting from the end, I also find valid tar entries there. It looks like the files have one or more sections of corrupt data "in the middle", however. I don't know any way to find out exactly where the error occurs, or what is wrong with the data. Or rather, I know where tar gets into trouble for each of the files, but I don't know how to find the corresponding compressed data, or its offset within the dump.
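One quick way to at least pin down the uncompressed offset where gzip gives up is to count how much decompresses cleanly before the error (a sketch; "dump.gz" is a placeholder for the compressed stream, with any AMANDA header already stripped off):

```shell
# Count how many bytes decompress cleanly before gzip hits the error.
# "dump.gz" is a placeholder for the stripped compressed stream.
# gzip writes everything up to the bad point before complaining, so
# wc -c gives the uncompressed offset where the trouble starts.
zcat dump.gz 2>/dev/null | wc -c
```

Comparing that offset with the point where tar starts complaining should tell whether the two tools disagree about where the damage begins.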

- Toralf



----- Forwarded message from Paul Bijnens <[EMAIL PROTECTED]> -----

From: Paul Bijnens <[EMAIL PROTECTED]>
To: Toralf Lund <[EMAIL PROTECTED]>
Cc: Amanda Mailing List <[EMAIL PROTECTED]>
Subject: Re: Multi-Gb dumps using tar + software compression (gzip)?
Date: Wed, 20 Oct 2004 13:59:31 +0200
Message-ID: <[EMAIL PROTECTED]>
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.1) Gecko/20040707

Toralf Lund wrote:



Other possible error sources that I think I have eliminated:

1. tar version issues - since gzip complains even if I just uncompress
and send the data to /dev/null, or use the -t option.
2. Network transfer issues. I get errors even with server
compression, and I'm assuming gzip would still produce a valid
(decompressible) stream even if its input data were garbled by network problems.
3. Problems with a specific amanda version. I've tried 2.4.4p1 and
2.4.4p3. Results are the same.
4. Problems with a special disk. I've tested more than one, as target
for "file" dumps as well as holding disk.
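For what it's worth, the gzip stream in a raw dump can also be tested without involving tar at all, by skipping the AMANDA header first (a sketch; 32 KB is the default AMANDA header size, and the file name is taken from the `file` output above):

```shell
# Skip the (default) 32 KB AMANDA header, then test the remaining
# gzip stream directly -- any error reported here rules tar out
# as the source of the problem.
dd if=fileserv._scanner2_Hoyde.20041006.0 bs=32k skip=1 | gzip -t
```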



5. Hardware errors, e.g. bad RAM (on a computer without ECC), a flaky disk controller, or bad cables.

If a single bit is flipped, gzip produces complete garbage from
that point on. Maybe you only notice it in such large backups with gzip, but it happens (less often) in other cases too.
Any tools available to test the hardware?
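memtest86 is the usual standalone choice for RAM. A crude but effective in-system check exploits the fact that gzip output is deterministic for the same input file (a sketch; `/some/large/file` is a placeholder for any big file on the suspect machine):

```shell
# gzip is deterministic: compressing the same file must always yield
# the same bytes.  Differing checksums across runs point at flaky
# RAM, disk controller, or cables rather than at the software.
# /some/large/file is a placeholder.
for i in 1 2 3 4 5; do
    gzip -c /some/large/file | md5sum
done
```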
