Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-11-05 Thread Toralf Lund
Toralf Lund wrote: Paul Bijnens wrote: Toralf Lund wrote: Other possible error sources that I think I have eliminated: 1. tar version issues - since gzip complains even if I just uncompress and send the data to /dev/null, or use the -t option. 2. Network transfer issues. I get errors even w

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Eric Siegerman
On Wed, Oct 20, 2004 at 12:52:12PM -0400, Eric Siegerman wrote: > echo $* >/securedirectory/sum$$ & > md5sum >/securedirectory/sum$$ & Oops: the "echo" command shouldn't have an "&". -- Eric Siegerman, Toronto, Ont. [EMAIL PROTECTED] The animal that
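For reference, a minimal sketch of the checksum capture being corrected above, with the fix applied (no "&" on the echo). The wrapper idea, the FIFO, and all paths are assumptions reconstructed from the truncated quote, not Eric's original script:

    #!/bin/sh
    # Hypothetical gzip wrapper: record an MD5 of the stream passing
    # through, so client- and server-side sums can be compared later.
    sumfile=/securedirectory/sum$$
    fifo=/securedirectory/fifo$$
    echo "$@" > "$sumfile"            # log the arguments; foreground, no "&"
    mkfifo "$fifo"
    md5sum < "$fifo" >> "$sumfile" &  # checksum a copy of the stream
    tee "$fifo" | /bin/gzip "$@"      # compress as usual while teeing
    wait                              # let md5sum finish writing its sum
    rm -f "$fifo"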

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Eric Siegerman
On Wed, Oct 20, 2004 at 01:18:45PM +0200, Toralf Lund wrote: > Other possible error sources that I think I have eliminated: > [ 0. gzip ] > 1. tar version issues [...] > 2. Network transfer issues [...] > 3. Problems with a specific amanda version [...] > 4. Problems with a special disk [
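For reference, the standalone gzip checks referred to in points 0 and 1 can be run directly against a restored dump image (the file name here is a placeholder):

    # Test the compressed image's integrity without involving tar:
    gzip -t /amanda/holding/dump.tar.gz

    # Or decompress and throw the data away, watching for gzip errors:
    gzip -dc /amanda/holding/dump.tar.gz > /dev/null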

Re: [paul.bijnens@xplanation.com: Re: Multi-Gb dumps using tar + software compression (gzip)?]

2004-10-20 Thread Toralf Lund
Forwarded message from Paul Bijnens <[EMAIL PROTECTED]>: From: Paul Bijnens <[EMAIL PROTECTED]> To: Toralf Lund <[EMAIL PROTECTED]> Cc: Amanda Mailing List <[EMAIL PROTECTED]> Subject: Re: Multi-Gb dumps using tar + software compression (gzip)? Date: Wed, 20 Oct 2004 13:59:31 +0200 Message-ID:

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Toralf Lund
Paul Bijnens wrote: Toralf Lund wrote: Other possible error sources that I think I have eliminated: 1. tar version issues - since gzip complains even if I just uncompress and send the data to /dev/null, or use the -t option. 2. Network transfer issues. I get errors even with server com

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Paul Bijnens
Toralf Lund wrote: Other possible error sources that I think I have eliminated: 1. tar version issues - since gzip complains even if I just uncompress and send the data to /dev/null, or use the -t option. 2. Network transfer issues. I get errors even with server compression, and I'm as

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Toralf Lund
Gene Heskett wrote: On Tuesday 19 October 2004 11:10, Paul Bijnens wrote: Michael Schaller wrote: I found out that this was a problem of my tar. I backed up with GNUTAR and "compress server fast". AMRESTORE restored the file but TAR (on the server!) gave some horrible messages like yours. I

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Toralf Lund
Michael Schaller wrote: Hi Toralf, I've had nearly the same problem this week. I found out that this was a problem of my tar. I backed up with GNUTAR and "compress server fast". AMRESTORE restored the file but TAR (on the server!) gave some horrible messages like yours. I transferred the file to th

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Gene Heskett
On Tuesday 19 October 2004 11:10, Paul Bijnens wrote: >Michael Schaller wrote: >> I found out that this was a problem of my tar. >> I backed up with GNUTAR and "compress server fast". >> AMRESTORE restored the file but TAR (on the server!) gave some >> horrible messages like yours. >> I transferred

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Paul Bijnens
Michael Schaller wrote: I found out that this was a problem of my tar. I backed up with GNUTAR and "compress server fast". AMRESTORE restored the file but TAR (on the server!) gave some horrible messages like yours. I transferred the file to the original machine ("client") and all worked fine. I

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Toralf Lund
Alexander Jolk wrote: Joshua Baker-LePain wrote: I think that OS and utility (i.e. gnutar and gzip) version info would be useful here as well. True, forgot that. I'm on Linux 2.4.19 (Debian woody), using GNU tar 1.13.25 and gzip 1.3.2. I have never had problems recovering files from huge

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Michael Schaller
Hi Toralf, I've had nearly the same problem this week. I found out that this was a problem of my tar. I backed up with GNUTAR and "compress server fast". AMRESTORE restored the file but TAR (on the server!) gave some horrible messages like yours. I transferred the file to the original machine ("cli
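Michael's workaround, restoring on the server but untarring on the original client, might look roughly like this (the host name, tape device, and paths are placeholders, not from the thread):

    # On the tape server: pipe the restored dump image over to the client ...
    amrestore -p /dev/nst0 client.example.com /home | \
        ssh client.example.com 'cat > /var/tmp/home.dump.tar'

    # ... then, on the client, list or extract with the client's own tar:
    tar -tvf /var/tmp/home.dump.tar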

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Alexander Jolk
Joshua Baker-LePain wrote: > I think that OS and utility (i.e. gnutar and gzip) version info would be > useful here as well. True, forgot that. I'm on Linux 2.4.19 (Debian woody), using GNU tar 1.13.25 and gzip 1.3.2. I have never had problems recovering files from huge dumps. Alex -- Alexand
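The OS and utility versions Joshua asked for can be gathered with the tools' own flags, e.g.:

    uname -sr        # kernel/OS release, e.g. "Linux 2.4.19"
    tar --version    # GNU tar prints its version on the first line
    gzip --version   # likewise for gzip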

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Joshua Baker-LePain
On Tue, 19 Oct 2004 at 11:35am, Alexander Jolk wrote > Toralf Lund wrote: > >1. Dumps of directories containing several Gbs of data (up to roughly > > 20Gb compressed in my case.) > >2. Use dumptype GNUTAR. > >3. Compress data using "compress client fast" or "compress server fast

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Toralf Lund
Alexander Jolk wrote: Toralf Lund wrote: 1. Dumps of directories containing several Gbs of data (up to roughly 20Gb compressed in my case.) 2. Use dumptype GNUTAR. 3. Compress data using "compress client fast" or "compress server fast". If you do, what exactly are your amanda.conf set

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Alexander Jolk
Toralf Lund wrote: >1. Dumps of directories containing several Gbs of data (up to roughly > 20Gb compressed in my case.) >2. Use dumptype GNUTAR. >3. Compress data using "compress client fast" or "compress server fast". > > If you do, what exactly are your amanda.conf settings? A

Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Toralf Lund
Since I'm still having problems gunzip'ing my large dumps (see separate thread), I was just wondering: some of you people out there are doing the same kind of thing, right? I mean, have 1. Dumps of directories containing several Gbs of data (up to roughly 20Gb compressed in my case.) 2
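A minimal dumptype matching points 2 and 3 of the question might look like this in amanda.conf; the name "big-gtar" is made up, and only the program and compress directives come from the thread:

    define dumptype big-gtar {
        comment "multi-Gb directories dumped with GNU tar"
        program "GNUTAR"
        compress client fast    # or: compress server fast
        index yes
    }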