Toralf Lund wrote:
Paul Bijnens wrote:
Toralf Lund wrote:
Other possible error sources that I think I have eliminated:
1. tar version issues - since gzip complains even if I just uncompress
and send the data to /dev/null, or use the -t option.
2. Network transfer issues. I get errors even with server
compression [...]
On Wed, Oct 20, 2004 at 12:52:12PM -0400, Eric Siegerman wrote:
> echo $* >/securedirectory/sum$$ &
> md5sum >/securedirectory/sum$$ &
Oops: the "echo" command shouldn't have an "&".
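Eric's correction can be sketched like this: the label line must be written in the foreground so that md5sum's append lands after it, and md5sum itself reads the dump stream from stdin. The file name, label, and helper function here are illustrative stand-ins, not his actual script:

```shell
#!/bin/sh
# Minimal sketch of the corrected snippet (names are illustrative).
sumfile="/tmp/gr_sum_example"

record_sum() {
    echo "$1" > "$sumfile"    # label line -- foreground, no '&' here
    md5sum >> "$sumfile"      # checksums the dump stream arriving on stdin
}

printf 'dump data\n' | record_sum backup-label
cat "$sumfile"
```

With the original background `&` on the echo, the two redirections race and the label can clobber or interleave with the checksum line; run sequentially, the file always holds the label followed by the MD5 line.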
--
Eric Siegerman, Toronto, Ont. [EMAIL PROTECTED]
On Wed, Oct 20, 2004 at 01:18:45PM +0200, Toralf Lund wrote:
> Other possible error sources that I think I have eliminated:
> [ 0. gzip ]
> 1. tar version issues [...]
> 2. Network transfer issues [...]
> 3. Problems with a specific amanda version [...]
> 4. Problems with a special disk [...]
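The two integrity checks mentioned in point 1 above can be reproduced on any gzip stream; here they are on a throwaway archive (the file name is illustrative):

```shell
#!/bin/sh
# Build a small gzip archive to check (illustrative file name).
printf 'some dump payload\n' | gzip -c > /tmp/gr_dump_example.gz

# Check 1: decompress and discard the data.
gzip -dc /tmp/gr_dump_example.gz > /dev/null && echo "decompress: OK"

# Check 2: gzip's built-in integrity test (verifies the trailing CRC).
gzip -t /tmp/gr_dump_example.gz && echo "gzip -t: OK"
```

Both checks verify the CRC32 stored in the gzip trailer, so a stream that passes them was not corrupted in transit; that is what rules tar out as the culprit when they fail.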
From: Paul Bijnens <[EMAIL PROTECTED]>
To: Toralf Lund <[EMAIL PROTECTED]>
Cc: Amanda Mailing List <[EMAIL PROTECTED]>
Subject: Re: Multi-Gb dumps using tar + software compression (gzip)?
Date: Wed, 20 Oct 2004 13:59:31 +0200
Gene Heskett wrote:
On Tuesday 19 October 2004 11:10, Paul Bijnens wrote:
Michael Schaller wrote:
I found out that this was a problem of my tar.
I backed up with GNUTAR and "compress server fast".
AMRESTORE restored the file but TAR (on the server!) gave some
horrible messages like yours.
I [...]
Michael Schaller wrote:
Hi Toralf,
I've had nearly the same problem this week.
I found out that this was a problem of my tar.
I backed up with GNUTAR and "compress server fast".
AMRESTORE restored the file but TAR (on the server!) gave some
horrible messages like yours.
I transferred the file to the original machine ("client") and all worked
fine.
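Since the symptom Michael describes comes down to the server's tar disagreeing with the client's, comparing versions on both ends is the obvious first diagnostic. A minimal check, with the remote host name purely illustrative:

```shell
#!/bin/sh
# Print the tar and gzip versions on this machine (the tape server).
tar --version | head -n 1
gzip --version | head -n 1

# Repeat on the backup client for comparison, e.g.:
# ssh client.example.com 'tar --version | head -n 1'
```

If the versions differ, extracting the amrestore'd archive on the machine whose tar wrote it, as Michael did, sidesteps the incompatibility.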
On Tuesday 19 October 2004 11:10, Paul Bijnens wrote:
>Michael Schaller wrote:
>> I found out that this was a problem of my tar.
>> I backed up with GNUTAR and "compress server fast".
>> AMRESTORE restored the file but TAR (on the server!) gave some
>> horrible messages like yours.
>> I transferred [...]
Alexander Jolk wrote:
Joshua Baker-LePain wrote:
I think that OS and utility (i.e. gnutar and gzip) version info would be
useful here as well.
True, forgot that. I'm on Linux 2.4.19 (Debian woody), using GNU tar
1.13.25 and gzip 1.3.2. I have never had problems recovering files from
huge dumps.
On Tue, 19 Oct 2004 at 11:35am, Alexander Jolk wrote
> Toralf Lund wrote:
> >1. Dumps of directories containing several Gbs of data (up to roughly
> > 20Gb compressed in my case.)
> >2. Use dumptype GNUTAR.
> >3. Compress data using "compress client fast" or "compress server fast".
Alexander Jolk wrote:
Toralf Lund wrote:
1. Dumps of directories containing several Gbs of data (up to roughly
20Gb compressed in my case.)
2. Use dumptype GNUTAR.
3. Compress data using "compress client fast" or "compress server fast".
If you do, what exactly are your amanda.conf settings? [...]
Since I'm still having problems gunzip'ing my large dumps - see separate
thread, I was just wondering:
Some of you people out there are doing the same kind of thing, right? I
mean, have you:
1. Dumps of directories containing several Gbs of data (up to roughly
20Gb compressed in my case.)
2. Use dumptype GNUTAR.
3. Compress data using "compress client fast" or "compress server fast".
If you do, what exactly are your amanda.conf settings?
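For comparison, the kind of dumptype being asked about might look like this in amanda.conf; the name, comment, and option mix are illustrative, based on standard Amanda dumptype syntax rather than anyone's posted configuration:

```
define dumptype gnutar-comp-fast {
    comment "multi-GB tar dumps with software compression"
    program "GNUTAR"
    compress client fast        # or: compress server fast
    index yes
}
```

The key choices in the thread are `program "GNUTAR"` (tar instead of dump) and whether `compress` runs on the client or the server; everything else is per-site preference.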