Re: ERROR: /usr/bin/gzip exited with status 1

2011-04-17 Thread Gour-Gadadhara Dasa
On Sun, 17 Apr 2011 13:54:28 -0400 Jean-Louis Martineau wrote: > amrestore -r Thank you. Then it means that before concatenating all the parts together, we need to skip the first 32k block? Sincerely, Gour -- “In the material world, conceptions of good and bad are all mental speculations…” (
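
A minimal sketch of that step, assuming each part produced by amrestore -r carries its own 32 KiB Amanda header (the file names are hypothetical):

    # Strip the 32k header from each raw part, join them, then unpack.
    dd if=part.1 bs=32k skip=1 >  dump.gz
    dd if=part.2 bs=32k skip=1 >> dump.gz    # repeat for every part
    gzip -dc dump.gz | tar -xvf -            # or restore(8) for dump images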

Re: ERROR: /usr/bin/gzip exited with status 1

2011-04-17 Thread Jean-Louis Martineau
d and now amrestore reports: amrestore on fbsd on compressed .gz tape (on linux) failed with: ERROR: /usr/bin/gzip exited with status 1. Any hint how to overcome it? (It's multi-tapes backup and I need to amrestore first and then to concatenate all the parts.) Sincerely, Gour

ERROR: /usr/bin/gzip exited with status 1

2011-04-17 Thread Gour-Gadadhara Dasa
sd on compressed .gz tape (on linux) failed with: ERROR: /usr/bin/gzip exited with status 1. Any hint how to overcome it? (It's multi-tapes backup and I need to amrestore first and then to concatenate all the parts.) Sincerely, Gour -- “In the material world, conceptions of good and bad are

Re: SV: FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2

2010-05-03 Thread Volker Pallas
Hi, just wanted to send you an update on this issue. Switching to auth=bsdtcp completely solved my problem. The working line from /etc/inetd.conf (for openbsd-inetd, and the amanda-user being "backup") is: amanda stream tcp nowait backup /usr/lib/amanda/amandad amandad -auth=bsdtcp amdump amindex
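
For reference, a commonly documented full form of that entry (the service list after "amdump amindex" is truncated above, so the names below are an assumption based on the amanda-auth(7) setup, not a quote from the post):

    # /etc/inetd.conf -- amandad with bsdtcp auth, amanda user "backup"
    amanda stream tcp nowait backup /usr/lib/amanda/amandad amandad -auth=bsdtcp amdump amindexd amidxtaped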

Re: SV: FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2

2010-04-21 Thread Dustin J. Mitchell
On Wed, Apr 21, 2010 at 11:04 AM, Volker Pallas wrote: > Is auth=bsdtcp mandatory? If you want to switch to bsdtcp, then yes. You'll also need to change your (x)inetd configuration accordingly. The amanda-auth(7) manpage may be of use to you in figuring the whole thing out. Dustin -- Open So

Re: SV: FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2

2010-04-21 Thread Volker Pallas
ry? Thank you, Volker Volker Pallas wrote: > Gunnarsson, Gunnar wrote: > >> Switching to tcp instead of using udp cured those problems. >> >>> Hi, >>> >>> I'm having a bit of a problem on *some* servers concerning failed >>> b

Re: SV: FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2

2010-04-16 Thread Volker Pallas
ose problems. >> Hi, >> >> I'm having a bit of a problem on *some* servers concerning failed >> backups with the error message: >> lev # FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] >> > > Gunnar had a similar problem - maybe his exper

SV: FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2

2010-04-13 Thread Gunnarsson, Gunnar
: FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2 On Mon, Apr 12, 2010 at 4:48 AM, Volker Pallas wrote: >  Hi, > > I'm having a bit of a problem on *some* servers concerning failed > backups with the error message: > lev # FAILED [spawn /bin/gzip: dup2 out:

Re: FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2

2010-04-13 Thread Dustin J. Mitchell
On Mon, Apr 12, 2010 at 4:48 AM, Volker Pallas wrote: >  Hi, > > I'm having a bit of a problem on *some* servers concerning failed > backups with the error message: > lev # FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] Gunnar had a similar problem - maybe his

FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] on 2.6.1p2

2010-04-12 Thread Volker Pallas
Hi, I'm having a bit of a problem on *some* servers concerning failed backups with the error message: lev # FAILED [spawn /bin/gzip: dup2 out: Bad file descriptor] usually these failed backups are "successfully retried", but sometimes I get the same error twice and the backup fo

Re: sendbackup: error [spawn /opt/csw/bin/gzip: dup2 err: Bad file number]

2009-04-23 Thread Dustin J. Mitchell
On Thu, Apr 23, 2009 at 3:18 PM, Darin Perusich wrote: > In my continued testing of amsuntar I am intermittently seeing this > "/opt/csw/bin/gzip: dup2 err: Bad file number" error during amdump. > While it appears to be random I have seen this occur with certain > parti

sendbackup: error [spawn /opt/csw/bin/gzip: dup2 err: Bad file number]

2009-04-23 Thread Darin Perusich
In my continued testing of amsuntar I am intermittently seeing this "/opt/csw/bin/gzip: dup2 err: Bad file number" error during amdump. While it appears to be random I have seen this occur with certain partitions more than others, I've been changing up the disklist to try and recrea

Re: performance tuning, gzip

2008-07-28 Thread Ian Turner
Good luck at your new place. How do you like it? Was/is it hard to move your consulting practice so far? --Ian On Saturday 26 July 2008 23:17:44 Jon LaBadie wrote: > On Sat, Jul 26, 2008 at 10:55:56PM -0400, Ian Turner wrote: > > Jon, > > > > I thought you were in Princeton. Did you move? > > >

Re: performance tuning, gzip

2008-07-26 Thread Jon LaBadie
On Sat, Jul 26, 2008 at 10:55:56PM -0400, Ian Turner wrote: > Jon, > > I thought you were in Princeton. Did you move? > > --Ian > Yes I did Ian. Moved at the beginning of the year to Reston, which for those unfamiliar with the area is about 20 miles west of Washington, D.C. Jon -- Jon H. LaB

Re: performance tuning, gzip

2008-07-26 Thread Ian Turner
urse ramped up > > the load on the clients even further. > > > > The question of which version of Gzip to run arose, we had a fairly > > old version and there is a newer-Sun version available, just didn't > > know how version sensitive we were. I know version of gzip (

Re: performance tuning, gzip

2008-07-25 Thread Jon LaBadie
as smaller, and it was taking > longer to get to us). > > So I increased the inparallel parameter, which of course ramped up > the load on the clients even further. > > The question of which version of Gzip to run arose, we had a fairly > old version and there is a newer-Sun

Re: performance tuning, gzip

2008-07-25 Thread Brian Cuttler
as smaller, and it was taking > longer to get to us). > > So I increased the inparallel parameter, which of course ramped up > the load on the clients even further. > > The question of which version of Gzip to run arose, we had a fairly > old version and there is a newer-Sun

performance tuning, gzip

2008-07-25 Thread Brian Cuttler
p the load on the clients even further. The question of which version of Gzip to run arose, we had a fairly old version and there is a newer-Sun version available, just didn't know how version sensitive we were. I know version of gzip (which we use on some partitions on these clients) is ve

Re: hardware gzip accelerator cards

2006-03-13 Thread Paul Bijnens
On 2006-03-11 14:17, Kai Zimmer wrote: Hi all, has anybody on the list experience with hardware gzip accelerator cards (e.g. from indranetworks)? Are they of any use for amanda - or is the disk-i/o the limiting factor? And how much are those (generally pci-based) cards? Never heard, and

Re: hardware gzip accelerator cards

2006-03-11 Thread Michael Loftis
--On March 11, 2006 2:17:50 PM +0100 Kai Zimmer <[EMAIL PROTECTED]> wrote: Hi all, has anybody on the list experience with hardware gzip accelerator cards (e.g. from indranetworks)? Are they of any use for amanda - or is the disk-i/o the limiting factor? And how much are those (general

Re: hardware gzip accelerator cards

2006-03-11 Thread Jon LaBadie
On Sat, Mar 11, 2006 at 02:17:50PM +0100, Kai Zimmer wrote: > Hi all, > > has anybody on the list experience with hardware gzip accelerator cards > (e.g. from indranetworks)? Are they of any use for amanda - or is the > disk-i/o the limiting factor? And how much are those (g

hardware gzip accelerator cards

2006-03-11 Thread Kai Zimmer
Hi all, has anybody on the list experience with hardware gzip accelerator cards (e.g. from indranetworks)? Are they of any use for amanda - or is the disk-i/o the limiting factor? And how much are those (generally pci-based) cards? thanks, Kai

Re: gzip trailing garbage

2006-02-21 Thread Kevin Till
Greg Troxel wrote: I'm using 2.4.5p1 on NetBSD with Kerberos encryption and authentication. I tried to verify some tapes and found that 'gzip -t' failed on the restored files. On investigation, after adding some better diagnostics to gzip (NetBSD's own), I found that the

gzip trailing garbage

2006-02-21 Thread Greg Troxel
I'm using 2.4.5p1 on NetBSD with Kerberos encryption and authentication. I tried to verify some tapes and found that 'gzip -t' failed on the restored files. On investigation, after adding some better diagnostics to gzip (NetBSD's own), I found that the problem was that th

warnings from NetBSD gzip about > 4GB saved files

2005-11-02 Thread Greg Troxel
NetBSD's gzip currently warns about output files > 4 GB, because the gzip format can't store such lengths. Also, it sets the exit status to 1 and prints EOPNOTSUPP, which is just plain wrong. I'm discussing how to fix this with other NetBSD people. I think the real issue is w
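
Background for this thread: the gzip trailer's ISIZE field holds the uncompressed length modulo 2^32, so a >4 GB stream cannot be recorded exactly. A quick way to see the wrapped value (file name hypothetical):

    gzip -l big-dump.gz    # the "uncompressed" column wraps modulo 2^32 for >4GB data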

Re: multiple gzip on same data!?

2005-06-29 Thread Graeme Humphries
On Wed, 2005-06-29 at 11:12 -0600, Michael Loftis wrote: > Then do client side compression? Is there really a reason as to why you're > not? Client side compression gives me around 3-4 MB / sec data transfers. Server side gives me around 10-15 MB / sec (with the current CPU in the AMANDA server)

Re: multiple gzip on same data!?

2005-06-29 Thread Graeme Humphries
On Wed, 2005-06-29 at 13:18 -0400, Jon LaBadie wrote: > "fast" rather than "best" might make a big difference Oh, can you specify compress-fast as well as srvcompress? That definitely would help. > Wishlist item: allow for compress "normal" as well as best and fast. > It often strikes a good bal

Re: multiple gzip on same data!?

2005-06-29 Thread Jon LaBadie
On Wed, Jun 29, 2005 at 10:58:07AM -0600, Graeme Humphries wrote: > On Wed, 2005-06-29 at 10:18 -0600, Michael Loftis wrote: > > Nope it isn't. One is for the index, one for the data. I had the same > > 'huh?!' question (sort of) a while back since I do client side compression > > and still had

Re: multiple gzip on same data!?

2005-06-29 Thread Michael Loftis
--On June 29, 2005 10:58:07 AM -0600 Graeme Humphries <[EMAIL PROTECTED]> wrote: Ahhh, that makes sense then. Alright, I've got to beef up my AMANDA server, because it's struggling along with just those 4 gzips, and I want to have 4 dumpers going simultaneously all the time. Then do client

Re: multiple gzip on same data!?

2005-06-29 Thread Graeme Humphries
On Wed, 2005-06-29 at 10:18 -0600, Michael Loftis wrote: > Nope it isn't. One is for the index, one for the data. I had the same > 'huh?!' question (sort of) a while back since I do client side compression > and still had gzip's running ;) Ahhh, that makes sense then. Alright, I've got to beef

Re: multiple gzip on same data!?

2005-06-29 Thread Jon LaBadie
> 9685 ?S 0:01 \_ /usr/lib/amanda/driver weekly > 9686 ?S 4:24 \_ taper weekly > 9687 ?S 0:59 | \_ taper weekly > 9699 ?S 9:45 \_ dumper0 weekly > 10629 ?S

Re: multiple gzip on same data!?

2005-06-29 Thread Michael Loftis
--On June 29, 2005 9:57:48 AM -0600 Graeme Humphries <[EMAIL PROTECTED]> wrote: Now, why oh why is it doing *two* gzip operations on each set of data!? It looks like the gzip --best isn't actually getting that much running time, so is there something going on here that's fa

multiple gzip on same data!?

2005-06-29 Thread Graeme Humphries
kly 9687 ?S 0:59 | \_ taper weekly 9699 ?S 9:45 \_ dumper0 weekly 10629 ?S 96:19 | \_ /bin/gzip --fast 10630 ?S 0:00 | \_ /bin/gzip --best 9700 ?S 6:52 \_ dumper1

gzip wrapper script for encrypted backups and restore not working

2005-03-31 Thread Oscar Ricardo Silva
ive using dd off the tape then run the gzip wrapper script, I now have a dump or a tar archive. I've looked through the list archives and others appeared to have this same problem but I didn't see a solution. I've changed the redirect in the script from: ${gzip_prog} ${gzip_f

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-11-05 Thread Toralf Lund
Toralf Lund wrote: Paul Bijnens wrote: Toralf Lund wrote: Other possible error sources that I think I have eliminated: 1. tar version issues - since gzip complains even if I just uncompress and send the data to /dev/null, or use the -t option. 2. Network transfer issues. I get errors even

Re: Unexpected 'gzip --best' processes

2004-10-21 Thread Toralf Lund
Joshua Baker-LePain wrote: On Thu, 21 Oct 2004 at 6:19pm, Toralf Lund wrote This may be related to our backup problems described earlier: I just noticed that during a dump running just now, I have # ps -f -C gzip UID PID PPID C STIME TTY TIME CMD amanda 3064 769 0 17:18

Re: Unexpected 'gzip --best' processes

2004-10-21 Thread Joshua Baker-LePain
On Thu, 21 Oct 2004 at 6:19pm, Toralf Lund wrote > This may be related to our backup problems described earlier: > > I just noticed that during a dump running just now, I have > > # ps -f -C gzip > UID PID PPID C STIME TTY TIME CMD > amanda 3064 769

Unexpected 'gzip --best' processes

2004-10-21 Thread Toralf Lund
This may be related to our backup problems described earlier: I just noticed that during a dump running just now, I have # ps -f -C gzip UID PID PPID C STIME TTY TIME CMD amanda 3064 769 0 17:18 pts/5 00:00:00 /bin/gzip --best amanda 3129 773 0 17:44 pts/5 00:00

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Eric Siegerman
On Wed, Oct 20, 2004 at 12:52:12PM -0400, Eric Siegerman wrote: > echo $* >/securedirectory/sum$$ & > md5sum >/securedirectory/sum$$ & Oops: the "echo" command shouldn't have an "&". -- | | /\ |-_|/ > Eric Siegerman, Toronto, Ont.[EMAIL PROTECTED] | | / The animal that
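
A hypothetical reconstruction of the wrapper idea with that fix applied; the paths, names, and fifo plumbing are assumptions, not the original script:

    #!/bin/sh
    # Record the wrapper's arguments, checksum the compressed stream on
    # the side, and pass the data through unchanged.
    fifo=/securedirectory/fifo.$$
    mkfifo "$fifo"
    md5sum < "$fifo" >> /securedirectory/sum$$ &
    echo "$*" > /securedirectory/sum$$      # no '&' here, per the correction
    /bin/gzip "$@" | tee "$fifo"
    rm -f "$fifo"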

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Eric Siegerman
On Wed, Oct 20, 2004 at 01:18:45PM +0200, Toralf Lund wrote: > Other possible error sources that I think I have eliminated: > [ 0. gzip ] > 1. tar version issues [...] > 2. Network transfer issues [...] > 3. Problems with a specific amanda version [...] > 4. Problems w

Re: [paul.bijnens@xplanation.com: Re: Multi-Gb dumps using tar + software compression (gzip)?]

2004-10-20 Thread Toralf Lund
from Paul Bijnens <[EMAIL PROTECTED]> - From: Paul Bijnens <[EMAIL PROTECTED]> To: Toralf Lund <[EMAIL PROTECTED]> Cc: Amanda Mailing List <[EMAIL PROTECTED]> Subject: Re: Multi-Gb dumps using tar + software compression (gzip)? Date: Wed, 20 Oct 2004 13:59:31 +0200 Message-ID:

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Toralf Lund
Paul Bijnens wrote: Toralf Lund wrote: Other possible error sources that I think I have eliminated: 1. tar version issues - since gzip complains even if I just uncompress and send the data to /dev/null, or use the -t option. 2. Network transfer issues. I get errors even with server

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Paul Bijnens
Toralf Lund wrote: Other possible error sources that I think I have eliminated: 1. tar version issues - since gzip complains even if I just uncompress and send the data to /dev/null, or use the -t option. 2. Network transfer issues. I get errors even with server compression, and I'

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Toralf Lund
ything but itself. (and I'm not sure that 1.13 could even recover its own output!) I hate to be boring and repetitive, but there are those here *now* who did not go thru that period of hair removal that 1.13 caused. Yep. But how about gzip? Any known issues there? I think I've rul

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-20 Thread Toralf Lund
've tried both. In fact, I've tested just about every combination of tar, gzip, filesystems, hosts, recovery sources (tape, disk dump, holding disk...) etc. I could think of, and I always get the same result. I'm thinking this can't possibly be a tar problem, though, or at leas

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Gene Heskett
On Tuesday 19 October 2004 11:10, Paul Bijnens wrote: >Michael Schaller wrote: >> I found out that this was a problem of my tar. >> I backed up with GNUTAR and "compress server fast". >> AMRESTORE restored the file but TAR (on the server!) gave some >> horrible messages like yours. >> I transferred

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Paul Bijnens
Michael Schaller wrote: I found out that this was a problem of my tar. I backed up with GNUTAR and "compress server fast". AMRESTORE restored the file but TAR (on the server!) gave some horrible messages like yours. I transferred the file to the original machine ("client") and all worked fine. I

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Toralf Lund
Alexander Jolk wrote: Joshua Baker-LePain wrote: I think that OS and utility (i.e. gnutar and gzip) version info would be useful here as well. True, forgot that. I'm on Linux 2.4.19 (Debian woody), using GNU tar 1.13.25 and gzip 1.3.2. I have never had problems recovering files from

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Michael Schaller
Hi Toralf, I've had nearly the same problem this week. I found out that this was a problem of my tar. I backed up with GNUTAR and "compress server fast". AMRESTORE restored the file but TAR (on the server!) gave some horrible messages like yours. I transferred the file to the original machine ("cli

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Alexander Jolk
Joshua Baker-LePain wrote: > I think that OS and utility (i.e. gnutar and gzip) version info would be > useful here as well. True, forgot that. I'm on Linux 2.4.19 (Debian woody), using GNU tar 1.13.25 and gzip 1.3.2. I have never had problems recovering files from huge d

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Joshua Baker-LePain
nce when the tape was failing. I'll send you my amanda.conf > privately. BTW which version are you using? I'm at version > 2.4.4p1-20030716. I think that OS and utility (i.e. gnutar and gzip) version info would be useful here as well. -- Joshua Baker-LePain Department of Biomedical Engineering Duke University

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Toralf Lund
Alexander Jolk wrote: Toralf Lund wrote: 1. Dumps of directories containing several Gbs of data (up to roughly 20Gb compressed in my case.) 2. Use dumptype GNUTAR. 3. Compress data using "compress client fast" or "compress server fast". If you do, what exactly are your amanda.conf set

Re: Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Alexander Jolk
Toralf Lund wrote: >1. Dumps of directories containing several Gbs of data (up to roughly > 20Gb compressed in my case.) >2. Use dumptype GNUTAR. >3. Compress data using "compress client fast" or "compress server fast". > > If you do, what exactly are your amanda.conf settings? A

Multi-Gb dumps using tar + software compression (gzip)?

2004-10-19 Thread Toralf Lund
Since I'm still having problems gunzip'ing my large dumps - see separate thread, I was just wondering: Some of you people out there are doing the same kind of thing, right? I mean, have 1. Dumps of directories containing several Gbs of data (up to roughly 20Gb compressed in my case.) 2

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-15 Thread Toralf Lund
to suspect that gzip itself is causing the problem. Any known issues, there? The client in question does have a fairly old version, 1.2.4, That rings a bell somewhere. Hasn't there once been a report on this list from someone whose zipped backups got corrupted at every (other) GB mark?

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-14 Thread Toralf Lund
Alexander Jolk wrote: Toralf Lund wrote: [...] I get the same kind of problem with harddisk dumps as well as tapes, and as it now turns out, also for holding disk files. And the disks and tape drive involved aren't even on the same chain. Actually, I'm starting to suspect that gzip

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-14 Thread Toralf Lund
s. I do seem to remember that I took care to make sure it wouldn't be used, when I installed Amanda. I've installed the freeware version a while ago (GNU tar) 1.13.25 without a hitch along with /usr/sbin/gzip. Both incarnations of gzip return the same version string as the one you included

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-14 Thread Geert Uytterhoeven
On Thu, 14 Oct 2004, Toralf Lund wrote: > Gene Heskett wrote: > > Also, the gzip here is 1.3.3, dated in 2002. There may have been fixes to > > it, probably in the >2GB file sizes areas. > > > Ahem. If >2GB data is or has been a problem, then I'm definitely d

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-14 Thread Toralf Lund
Gene Heskett wrote: On Wednesday 13 October 2004 11:07, Toralf Lund wrote: Jean-Francois Malouin wrote: [ snip ] Actually, I'm starting to suspect that gzip itself is causing the problem. Any known issues, there? The client in question does have a fairly old version, 1.2.4, I

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-13 Thread Gene Heskett
On Wednesday 13 October 2004 11:07, Toralf Lund wrote: >Jean-Francois Malouin wrote: >> [ snip ] >> >>>Actually, I'm starting to suspect that gzip itself is causing the >>>problem. Any known issues, there? The client in question does have >>> a fairly

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-13 Thread Toralf Lund
Jean-Francois Malouin wrote: [ snip ] Actually, I'm starting to suspect that gzip itself is causing the problem. Any known issues, there? The client in question does have a fairly old version, 1.2.4, I think (that's the latest one supplied by SGI, unless they have upgraded it ver

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-13 Thread Jean-Francois Malouin
* Toralf Lund <[EMAIL PROTECTED]> [20041013 09:43]: > Alexander Jolk wrote: > > >Toralf Lund wrote: > > > > > >>tar: Skipping to next header > >>tar: Archive contains obsolescent base-64 headers > >>37800+0 records in > >>3780

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-13 Thread Toralf Lund
Alexander Jolk wrote: Toralf Lund wrote: tar: Skipping to next header tar: Archive contains obsolescent base-64 headers 37800+0 records in 37800+0 records out gzip: stdin: invalid compressed data--crc error tar: Child returned status 1 tar: Error exit delayed from previous errors I'v

Re: tar/gzip problems on restore (CRC error, "Archive contains obsolescentbase-64 headers"...)

2004-10-13 Thread Alexander Jolk
Toralf Lund wrote: > tar: Skipping to next header > tar: Archive contains obsolescent base-64 headers > 37800+0 records in > 37800+0 records out > > gzip: stdin: invalid compressed data--crc error > tar: Child returned status 1 > tar: Error exit delayed from previous e

tar/gzip problems on restore (CRC error, "Archive contains obsolescent base-64 headers"...)

2004-10-13 Thread Toralf Lund
acking the harddisk dump in a more direct manner: # dd if=/dumps/mirror/d4/data/00013.fileserv._scanner2_Hoyde.6 bs=32k skip=1 | tar -xvpzkf - [ file extract info skipped ] tar: Skipping to next header tar: Archive contains obsolescent base-64 headers 37800+0 records in 37800+0 records out gzip: stdi
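
To isolate gzip from tar (as the follow-ups do when they uncompress to /dev/null or use the -t option), the stream can be tested directly; a sketch on the same dump file:

    # Skip the 32k Amanda header and let gzip validate its own CRC.
    dd if=/dumps/mirror/d4/data/00013.fileserv._scanner2_Hoyde.6 bs=32k skip=1 | gzip -t
    # or decompress to /dev/null:
    dd if=/dumps/mirror/d4/data/00013.fileserv._scanner2_Hoyde.6 bs=32k skip=1 | gzip -dc > /dev/null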

Re: Compression using gzip

2004-08-27 Thread Paul Bijnens
Kaushal Shriyan wrote: I wanted to enable gzip compression on my backups since I want to accommodate lots of data onto a 40GB tape. At present I am using define dumptype root-tar { global program "GNUTAR" comment "root partitions dumped with tar" com

Re: Compression using gzip

2004-08-27 Thread Martin Hepworth
wrote: Hi I wanted to enable gzip compression on my backups since I want to accommodate lots of data onto a 40GB tape. At present I am using define dumptype root-tar { global program "GNUTAR" comment "root partitions dumped with tar" compress none index

Compression using gzip

2004-08-27 Thread Kaushal Shriyan
Hi I wanted to enable gzip compression on my backups since I want to accommodate lots of data onto a 40GB tape. At present I am using define dumptype root-tar { global program "GNUTAR" comment "root partitions dumped with tar" compress none index
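
One way to turn on compression against the dumptype shown, sketched from the amanda.conf(5) conventions used elsewhere in these threads (the derived dumptype name and the client/server choice are assumptions):

    define dumptype root-tar-compressed {
        root-tar                    # inherit the settings above
        comment "root partitions dumped with tar, gzip-compressed"
        compress client fast        # or "compress server fast"
    }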

Re: Question about gzip on the server

2003-11-13 Thread Paul Bijnens
Dana Bourgeois wrote: OK, so all my clients are compressing. I have 13 clients and about 5 of them are Solaris using dump, the rest are using tar. Could someone explain why the dumpers are also spawning a 'gzip --best' process? They only use 5 or 6 seconds of CPU so they are not doin

Question about gzip on the server

2003-11-12 Thread Dana Bourgeois
OK, so all my clients are compressing. I have 13 clients and about 5 of them are Solaris using dump, the rest are using tar. Could someone explain why the dumpers are also spawning a 'gzip --best' process? They only use 5 or 6 seconds of CPU so they are not doing much but I don'

Re: amanda + gzip errors on debian?

2003-07-15 Thread C. Chan
Just a note that I have experienced a similar problem, but with Redhat and Mandrake rather than Debian Linux. The dump format is GNU tar with gzip compressed on the client side, written to a large holding disk then flushed to tape. The archives on holding disk verify OK, but the problem is

Re: amanda + gzip errors on debian?

2003-07-15 Thread Eric Siegerman
On Tue, Jul 15, 2003 at 12:10:27PM -0400, Kurt Yoder wrote: > However, I > was able to duplicate the problem simply by gzipping a big file to > my ATA/IDE holding disk. So I'm certain it's not a scsi problem. Is it repeatable? I.e. if you gzip the *same* file five times,

Re: amanda + gzip errors on debian?

2003-07-15 Thread Kurt Yoder
Niall O Broin said: > What's your backup device ? If it's a SCSI tape then I'd say your > problem is > most likely SCSI cabling termination. I had this a long time ago and > it drove > me nuts. I eventually found that the SCSI chain wasn't terminated > correctly. > Just like you, I would only e

Re: amanda + gzip errors on debian?

2003-07-15 Thread Niall O Broin
On Tuesday 15 July 2003 16:07, Kurt Yoder wrote: > they seem to go fine. However, upon verifying the backups, I notice > gzip errors. I get two different kinds of errors: "crc" errors and > "format violated" errors. The errors don't happen on all dump > images

amanda + gzip errors on debian?

2003-07-15 Thread Kurt Yoder
Hello list I've been having a problem with amanda and gzip on my debian backup servers for a while now. I do my backups with gzip compression, and they seem to go fine. However, upon verifying the backups, I notice gzip errors. I get two different kinds of errors: "crc" err

Re: amrecover failure, corrupted gzip file?

2003-03-29 Thread Gene Heskett
or other thoughts? Is this the Linux >> > dump/restore problem I've seen talked about on the mailing >> > list? I don't understand how the gzip file could be corrupted >> > by a problem internal to the dump/restore cycle. >> >> Answering my own ques

Re: amrecover failure, corrupted gzip file?

2003-03-29 Thread Jean-Louis Martineau
ughts? Is this the Linux dump/restore > > problem I've seen talked about on the mailing list? I don't > > understand how the gzip file could be corrupted by a problem internal > > to the dump/restore cycle. > > Answering my own question after a week of testing

Re: amrecover failure, corrupted gzip file?

2003-03-29 Thread Gene Heskett
On Fri March 28 2003 23:32, Gene Heskett wrote: >On Fri March 28 2003 12:46, Mike Simpson wrote: >>Hi -- >> >>> Any tips or tricks or other thoughts? Is this the Linux >>> dump/restore problem I've seen talked about on the mailing >>> list? I don'

Re: amrecover failure, corrupted gzip file?

2003-03-28 Thread Gene Heskett
On Fri March 28 2003 12:46, Mike Simpson wrote: >Hi -- > >> Any tips or tricks or other thoughts? Is this the Linux >> dump/restore problem I've seen talked about on the mailing list? >> I don't understand how the gzip file could be corrupted by a >> pr

Re: amrecover failure, corrupted gzip file?

2003-03-28 Thread Mike Simpson
Hi -- > Any tips or tricks or other thoughts? Is this the Linux dump/restore > problem I've seen talked about on the mailing list? I don't > understand how the gzip file could be corrupted by a problem internal > to the dump/restore cycle. Answering my own question a

amrecover failure, corrupted gzip file?

2003-03-21 Thread Mike Simpson
rticularly unusual in the amrecover debug file on the client side. The corresponding amidxtaped debug file on the tape host side seemed to be running normally, then terminating on a gzip error: amidxtaped: time 10.959: Ready to execv amrestore with: path = /usr/local/sbin/amrestore argv[0] =

Re: Bug? - gzip running on client AND server

2003-01-15 Thread Gerhard den Hollander
* Orion Poplawski <[EMAIL PROTECTED]> (Wed, Jan 15, 2003 at 12:31:44PM -0700) > Just noticed that on at least one of my amanda disk dumps, it is being run > through gzip on client and on the server. The details: > lsof -p 7200: > COMMAND PID USER FD TYPE DEVICE SIZE

Re: Bug? - gzip running on client AND server

2003-01-15 Thread Orion Poplawski
Joshua Baker-LePain wrote: On Wed, 15 Jan 2003 at 12:31pm, Orion Poplawski wrote Just noticed that on at least one of my amanda disk dumps, it is being run through gzip on client and on the server. The details: I'm pretty sure that the gzip on the server is compressing the index

Re: Bug? - gzip running on client AND server

2003-01-15 Thread Joshua Baker-LePain
On Wed, 15 Jan 2003 at 12:31pm, Orion Poplawski wrote > Just noticed that on at least one of my amanda disk dumps, it is being run > through gzip on client and on the server. The details: I'm pretty sure that the gzip on the server is compressing the index file, *not* the du
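
A quick way to confirm which stream a server-side gzip is attached to, in the spirit of the lsof output quoted above in this thread (the PID is hypothetical):

    lsof -p 7200    # an index gzip has its output file under the index directory, not the dump stream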

Bug? - gzip running on client AND server

2003-01-15 Thread Orion Poplawski
Just noticed that on at least one of my amanda disk dumps, it is being run through gzip on client and on the server. The details: disklist: lewis /export/lewis3 comp-best-user-tar amanda.conf: define dumptype root-tar { global program "GNUTAR" comment "

Re: Speed of backups under amanda with gpg and gzip wrapper?

2002-01-31 Thread Jennifer Peterson
Does anybody else have additional feedback on this reply from Greg? It concerns a secure backup scheme for amanda whereby the backups are passed to a gzip wrapper that encrypts the data with gpg and then forwards it to the real gzip for further compression. I'd wondered abou

Speed of backups under amanda with gpg and gzip wrapper?

2002-01-30 Thread Jennifer Peterson
Hello, I'm currently in the testing phase for switching our amanda backups over to Judith Freeman's secure scheme, using gpg and a gzip wrapper (http://security.uchicago.edu/tools/gpg-amanda.) Everything's working great with our test computers, and, so far, I'm pre
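
A minimal sketch of such a wrapper, following the order described here (encrypt with gpg, then forward to the real gzip); the key ID, paths, and option handling are assumptions, not Judith Freeman's actual script:

    #!/bin/sh
    REAL_GZIP=/bin/gzip.real          # the renamed system gzip
    KEY=backup@example.org            # hypothetical recipient key
    case "$1" in
      -d*) "$REAL_GZIP" -dc | gpg --batch --decrypt ;;    # restore: gunzip, then decrypt
      *)   gpg --batch --encrypt -r "$KEY" | "$REAL_GZIP" "$@" ;;
    esac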

Re: gzip running when "compress none"

2001-10-28 Thread Chris Dahn
On Wednesday 24 October 2001 10:47 am, David Chin wrote: > Howdy, > > I'm running amanda 2.4.2p2 on RH7.1 Linux and HP-UX 10.20, with a Linux box > acting as server. On the server, there is a "gzip --best" process running > even though I have "compress none"

RE: gzip running when "compress none"

2001-10-24 Thread Amanda Admin
[mailto:[EMAIL PROTECTED]] On Behalf Of David Chin > Sent: Wednesday, October 24, 2001 8:48 AM > To: [EMAIL PROTECTED] > Subject: gzip running when "compress none" > > > > Howdy, > > I'm running amanda 2.4.2p2 on RH7.1 Linux and HP-UX 10.20, with a > Lin

Re: gzip running when "compress none"

2001-10-24 Thread Mitch Collinsworth
> I'm running amanda 2.4.2p2 on RH7.1 Linux and HP-UX 10.20, with a Linux box > acting as server. On the server, there is a "gzip --best" process running > even though I have "compress none" in the "global" configuration. Is this > norm

gzip running when "compress none"

2001-10-24 Thread David Chin
Howdy, I'm running amanda 2.4.2p2 on RH7.1 Linux and HP-UX 10.20, with a Linux box acting as server. On the server, there is a "gzip --best" process running even though I have "compress none" in the "global" configuration. Is this normal? --Dave Chin [EMAIL PROTECTED]

Still trying to get gzip/gpg

2001-10-18 Thread ahall
Hello, I am still trying to get gzip/gpg working. I did not receive any replies from my last two mails, so let me try again not so broad. If someone might be able to answer this, that would be awesome: As I understand the process the data should be written to tape with gzip, not dump. But

Re: gzip

2001-02-06 Thread Gerhard den Hollander
* John R. Jackson <[EMAIL PROTECTED]> (Mon, Feb 05, 2001 at 11:59:56PM -0500) >>2. quoting a colocation facility's website: >>"We use bzip2 instead of gzip for data compression. ... > This comes up here about once a month :-). There was a lengthy discussion > last

Re: gzip

2001-02-05 Thread John R. Jackson
>1. I have all of my gzips set to fast instead of best but whenever amdump is >running there will be a gzip --fast and gzip --best for every file that is >in my holding disk. What are the reasons behind this? The --best one is doing the index files, not the data stream. >2. quoting

gzip

2001-02-05 Thread Ryan Williams
I have 2 questions relating to gzip. 1. I have all of my gzips set to fast instead of best but whenever amdump is running there will be a gzip --fast and gzip --best for every file that is in my holding disk. What are the reasons behind this? 2. quoting a colocation facility's website: "W

Re: Mac OS X Server problems w/ gzip

2000-11-16 Thread Kevin M. Myer
On Wed, 15 Nov 2000, Sandra Panesso wrote: > Hi Kevin > > I want to know if you have tried to run amanda on Mac OS X Beta. If you did > please tell me how it was. My question is because I am testing to run amanda > on Mac OS X Beta but I found some problems when I tried to compile it. I > use

Re: Mac OS X Server problems w/ gzip

2000-11-15 Thread Sandra Panesso
disabled ktrace debugging of the kernel so there wasn't > much I could do to figure out where the problem is. However, recently, I > decided to do a set of dumps with compression turned off. It turns out, > that's where the slowdown is occurring. For some reason, the compression >

Re: Mac OS X Server problems w/ gzip

2000-11-09 Thread Kevin M. Myer
On Thu, 9 Nov 2000, Mitch Collinsworth wrote: > Have you tried compress client fast yet or are you still doing client > best? Yes, actually, I had been using client fast for all my backups. Maybe I would do better with client best :) Still, the thing that irks me most about it is not that the

Re: Mac OS X Server problems w/ gzip

2000-11-09 Thread Mitch Collinsworth
> >... gzip is just really, really slow when used with AMANDA under Mac > >OS X Server. Command line issued tar/gzip pipes seem to work reasonably > >fast on the OS X Server. > > Well, if one of these boxes ever drops in my lap (and I have time), > I guess I can ta

Re: Mac OS X Server problems w/ gzip

2000-11-09 Thread Mitch Collinsworth
on that machine (a 400MHz G4 machine with a Gig of > RAM). So its not merely an issue of gzip compression adding time to the > backups. gzip is just really, really slow when used with AMANDA under Mac > OS X Server. Command line issued tar/gzip pipes seem to work reasonably > fast on t
