Apache::GzipChain & Apache::OutputChain

2001-08-08 Thread Ričardas Čepas

Hi!

What does this error message mean?
untie attempted while 1 inner references still exist at 
/usr/local/lib/perl5/site_perl/5.6.1/Apache/OutputChain.pm line 28.
Mozilla shows an empty screen.  I have this in httpd.conf:
<Location /scripts>
SetHandler perl-script
PerlHandler Apache::OutputChain Apache::GzipChain My::Site
</Location>

Summary of my perl5 (revision 5.0 version 6 subversion 1) configuration:
  Platform:
osname=freebsd, osvers=4.3-stable, archname=i386-freebsd
uname='freebsd richard.eu.org 4.3-stable freebsd 4.3-stable #0: sat apr 28 
12:13:21 eet 2001 [EMAIL PROTECTED]:/usr/src/sys/compile/mykernel i386 '
config_args=''
hint=recommended, useposix=true, d_sigaction=define
usethreads=undef use5005threads=undef useithreads=undef usemultiplicity=undef
useperlio=undef d_sfio=undef uselargefiles=define usesocks=undef
use64bitint=undef use64bitall=undef uselongdouble=undef
  Compiler:
cc='cc', ccflags ='-fno-strict-aliasing -I/usr/local/include',
optimize='-pipe -O2 -march=i686',
cppflags='-fno-strict-aliasing -I/usr/local/include'
ccversion='', gccversion='2.95.3 [FreeBSD] 20010315 (release)', gccosandvers=''
intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=1234
d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=12
ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='off_t', lseeksize=8
alignbytes=4, usemymalloc=n, prototype=define
  Linker and Libraries:
ld='cc', ldflags ='-Wl,-E  -L/usr/local/lib'
libpth=/usr/lib /usr/local/lib
libs=-lgdbm -lm -lc -lcrypt -liconv -lutil
perllibs=-lm -lc -lcrypt -liconv -lutil
libc=, so=so, useshrplib=true, libperl=libperl.so
  Dynamic Linking:
dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags='  
-Wl,-R/usr/lib/perl5/5.6.1/i386-freebsd/CORE'
cccdlflags='-DPIC -fpic', lddlflags='-shared  -L/usr/local/lib'


Characteristics of this binary (from libperl): 
  Compile-time options: USE_LARGE_FILES
  Built under freebsd
  Compiled at May  4 2001 20:25:10
  @INC:
/usr/lib/perl5/5.6.1/i386-freebsd
/usr/lib/perl5/5.6.1
/usr/local/lib/perl5/site_perl/5.6.1/i386-freebsd
/usr/local/lib/perl5/site_perl/5.6.1
/usr/local/lib/perl5/site_perl/5.005/i386-freebsd
/usr/local/lib/perl5/site_perl/5.005
/usr/local/lib/perl5/site_perl
.

-- 
  ☻ Ričardas Čepas ☺



RE: Apache::GzipChain & Apache::OutputChain

2001-08-08 Thread Geoffrey Young



 -Original Message-
 From: Ricardas Cepas [mailto:[EMAIL PROTECTED]]
 Sent: Wednesday, August 08, 2001 2:08 PM
 To: [EMAIL PROTECTED]
 Subject: Apache::GzipChain & Apache::OutputChain
 
 
 Hi!
 
 What does this error message mean?
 untie attempted while 1 inner references still exist at 
 /usr/local/lib/perl5/site_perl/5.6.1/Apache/OutputChain.pm line 28.

it means what it says :)  if OutputChain works like Apache::Filter at all,
then it needs the reference to maintain an accurate handler count, so you
can safely ignore the warning.  I haven't looked at the code, though.
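
For reference, perl prints that warning whenever you untie a variable
while something else still holds a reference to the tied object.  A
minimal standalone sketch of the mechanism (nothing to do with
OutputChain's actual code, just a reproduction of the warning):

    # reproduce "untie attempted while 1 inner references still exist"
    use warnings;

    package Counter;
    sub TIESCALAR { my $val = 0; return bless \$val }
    sub FETCH     { ${ $_[0] } }
    sub STORE     { ${ $_[0] } = $_[1] }

    package main;
    tie my $s, 'Counter';
    my $obj = tied $s;   # keep a second reference to the tied object...
    untie $s;            # ...so untie warns about 1 inner reference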

 Mozilla shows an empty screen.  I have this in httpd.conf:
 <Location /scripts>
 SetHandler perl-script
 PerlHandler Apache::OutputChain Apache::GzipChain My::Site
 </Location>

you would be better off moving to Apache::Filter and Apache::Compress, both
of which are well developed and maintained... 

I think the general opinion is that OutputChain and GzipChain are becoming
more deprecated by the day...
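
For what it's worth, a minimal Apache::Filter setup would look roughly
like this (a sketch from the modules' docs as I remember them -- check
perldoc Apache::Filter and Apache::Compress, and note that My::Site
itself has to be written to be filter-aware):

    PerlModule Apache::Filter
    <Location /scripts>
    SetHandler perl-script
    PerlSetVar Filter On
    PerlHandler My::Site Apache::Compress
    </Location>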

--Geoff



Re: Apache::GzipChain

2000-11-05 Thread Ken Williams

[EMAIL PROTECTED] (Jerrad Pierce) wrote:
 Is anybody using GzipChain?
 
 Is there some known means of verifying that it is in fact working properly?
 
 (Other than watching an unreliable browser progress bar)

Note that Apache::Compress is out there too.  It's a newer module,
cooperates with Apache::Filter, and seems to work in general.  People
seem to be using it.

Just wanted to make sure you knew about it.


  ------
  Ken Williams                       Last Bastion of Euclidity
  [EMAIL PROTECTED]                  The Math Forum



Re: Apache::GzipChain

2000-10-30 Thread Matt Sergeant

On Sat, 28 Oct 2000, Jerrad Pierce wrote:

 Is anybody using GzipChain?

Not in itself, but I'm using AxKit which also does gzip compression.

 Is there some known means of verifying that it is in fact working properly?

Yes, use:

lwp-request -H 'Accept-Encoding: gzip' -e -d url

Or omit the -d and check for gobbledegook. But it tends to stuff up your
terminal :-)
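
If it's working you should see a gzip Content-Encoding header in the -e
output, something like this (abridged; exact headers vary by server):

    $ lwp-request -H 'Accept-Encoding: gzip' -e -d http://yourserver/page.html
    ...
    Content-Encoding: gzip
    Content-Type: text/html
    ...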

-- 
Matt/

/||** Director and CTO **
   //||**  AxKit.com Ltd   **  ** XML Application Serving **
  // ||** http://axkit.org **  ** XSLT, XPathScript, XSP  **
 // \\| // ** Personal Web Site: http://sergeant.org/ **
 \\//
 //\\
//  \\




Re: Apache::GzipChain

2000-10-30 Thread G.W. Haywood

Hi there,

On Mon, 30 Oct 2000, Matt Sergeant wrote:

 On Sat, 28 Oct 2000, Jerrad Pierce wrote:
  Is there some known means of verifying that it is in fact working properly?
 lwp-request -H 'Accept-Encoding: gzip' -e -d url
 Or omit the -d and check for gobbledegook. But it tends to stuff up your
 terminal :-)

If you're using Linux just type 'reset' after it's done that.

73,
Ged.




Re: Apache::GzipChain

2000-10-29 Thread G.W. Haywood

Hi again,

On Sat, 28 Oct 2000, G.W. Haywood wrote:
 On Sat, 28 Oct 2000, Jerrad Pierce wrote:
  Is anybody using GzipChain?
 IIRC, Josh said he was.

There are apparently some problems with IE claiming to support it and
then not supporting it.  Quote from Josh, edited to preserve anonymity:

--
The compression stuff is amazing, for 10% CPU, a 40K doc like [menu]
can be squeezed down to 5K.  I already shaved off 10% request time by
optimizing [module names] so that's a wash.  There was a problem at
[client name] with a couple users with proxy configured for IE, but
not really using one for some reason, and I never worked past that
issue, caught up in the rest.
--

  Is there some known means of verifying that it is in fact working properly?

Have you some reason to suspect it isn't?  Send something to yourself?
Ask your users?

73,
Ged.






Re: Apache::GzipChain

2000-10-28 Thread G.W. Haywood

Hi there,

On Sat, 28 Oct 2000, Jerrad Pierce wrote:

 Is anybody using GzipChain?

IIRC, Josh said he was.  He didn't complain about it.  Raved, in fact.

 Is there some known means of verifying that it is in fact working properly?

LWP?

73,
Ged.




Re: Apache::GzipChain

2000-10-28 Thread Tom Brown

On Sat, 28 Oct 2000, G.W. Haywood wrote:

 Hi there,
 
 On Sat, 28 Oct 2000, Jerrad Pierce wrote:
 
  Is anybody using GzipChain?
 
 IIRC, Josh said he was.  He didn't complain about it.  Raved, in fact.
 
  Is there some known means of verifying that it is in fact working properly?
 
 LWP?

better to use your logs... LWP won't trigger it... if you download a 100k
page, then look at "view page info" or save it to disk, but your access
log shows 15k, you know it's doing its job...  (hhmm, seems I was using
Apache::Gzip until I got my ADSL back ... but at that time, it was a
non-trivial exercise, and compressing _everything_ (including PHP scripts
etc...) required using LWP internally, which worked even better for
checking functionality, because you had two log entries: the raw one from
localhost, and the compressed one from the remote agent, with
"appropriate" variances in their sizes :-)

 
 73,
 Ged.
 

--
[EMAIL PROTECTED]   | Don't go around saying the world owes you a living;
http://BareMetal.com/  | the world owes you nothing; it was here first.
web hosting since '95  | - Mark Twain




Apache::GzipChain and scalability

2000-05-26 Thread Bruce Lo

I tried out Apache::GzipChain for dynamic mod_perl pages (using
Apache::Registry), and it was great for reducing the download time
(especially over modem).  I am seriously thinking about using it for our
production environment.  However, some people are concerned about it
using up too many resources.  Has anyone looked into scalability issues?
Would I see significantly reduced throughput using GzipChain?

Also, why don't most sites gzip their pages (doing a redirect based on
browser support)?



Re: Apache::GzipChain and scalability

2000-05-26 Thread Devin Ben-Hur

Bruce Lo wrote:
 
 I tried out Apache::GzipChain for dynamic mod_perl pages (using
 Apache::Registry), and it was great for reducing the download time
 (especially over modem).  I am seriously thinking about using it for our
 production environment.  However, some people are concerned about it
 using up too many resources.  Has anyone looked into scalability issues?
 Would I see significantly reduced throughput using GzipChain?

We've been gzipping for a while at eMerchandise.com (though not using
gzip chain). We addressed this issue by making the gzip pass decide
whether to just pass it through or to do the compression based on
current CPU load on the server.  So when you've got extra cycles you
shrink the file to improve bandwidth utilization; if you're running near
peak processor utilization, you send the bytes raw.
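
The decision logic reduces to something like this sketch (illustrative
only, not our production code -- the package name, the threshold, and
the uptime parsing are all made up for the example):

    package My::MaybeGzip;
    use strict;
    use Compress::Zlib ();

    my $MAX_LOAD = 2.0;              # illustrative threshold

    sub one_minute_load {
        # loose parse of `uptime`; output format varies across unixes
        my ($load) = `uptime` =~ /load averages?:\s*([\d.]+)/;
        return defined $load ? $load : 0;
    }

    sub maybe_gzip {
        my ($r, $content) = @_;
        my $accept = $r->header_in('Accept-Encoding') || '';
        if ($accept =~ /\bgzip\b/ && one_minute_load() < $MAX_LOAD) {
            $r->header_out('Content-Encoding' => 'gzip');
            return Compress::Zlib::memGzip($content);
        }
        return $content;             # near peak load: send the bytes raw
    }

    1;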

We've had no scaling problems. What kind of system load do your
production server(s) see now?  What is it during peak traffic periods?

 Also why don't most sites gzip their pages (do redirect based on browser support)?

Because they're lazy or stupid? :)

-- 
Devin Ben-Hur | President / CTO  | mailto:[EMAIL PROTECTED]
The eMarket Group | eMerchandise.com | http://www.eMerchandise.com
503/944-5044 x228 | 
"Forrester Research projects that by 2003, Internet start-ups will have
 focused so relentlessly on infrastructure that there will be no 
 remaining actual content on the Web. "  -- Salon.com 14-Apr-2000



Re: Apache::GzipChain and scalability

2000-05-26 Thread Drew Taylor

Devin Ben-Hur wrote:
 
 Bruce Lo wrote:
 
  I tried out Apache::GzipChain for dynamic mod_perl pages (using
  Apache::Registry), and it was great for reducing the download time
  (especially over modem).  I am seriously thinking about using it for our
  production environment.  However, some people are concerned about it
  using up too many resources.  Has anyone looked into scalability issues?
  Would I see significantly reduced throughput using GzipChain?
 
 We've been gzipping for a while at eMerchandise.com (though not using
 gzip chain). We addressed this issue by making the gzip pass decide
 whether to just pass it through or to do the compression based on
 current CPU load on the server.  So when you've got extra cycles you
 shrink the file to improve bandwidth utilization, if you're running near
 peak processor utilization you send the bytes raw.
Devin,

I have read debates in the past about which browsers will reliably
accept gzip content. Do you have a list of such browsers? IIRC, it was
IE that was the most troublesome w/ proper display.

-- 
Drew Taylor
Vialogix Communications, Inc.
501 N. College Street
Charlotte, NC 28202
704 370 0550
http://www.vialogix.com/



Re: Apache::GzipChain and scalability

2000-05-26 Thread Randal L. Schwartz

 "Drew" == Drew Taylor [EMAIL PROTECTED] writes:

Drew I have read debates in the past about which browsers will reliably
Drew accept gzip content. Do you have a list of such browsers? IIRC, it was
Drew IE that was the most troublesome w/ proper display.

Why base it on browser?  Won't checking the request "Accept-Encoding:" for
/gzip/ be enough?  Then the browser can tell you whether it works or
not, and you can just have an exceptions list for those that lie.
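
In code that's just something like this (a sketch -- the exceptions
list below is a placeholder, not a vetted list of broken agents):

    my @liars = ( qr/SomeBrokenAgent/ );        # placeholder patterns

    sub wants_gzip {
        my $r  = shift;
        my $ae = $r->header_in('Accept-Encoding') || '';
        return 0 unless $ae =~ /\bgzip\b/;      # browser didn't ask
        my $ua = $r->header_in('User-Agent') || '';
        return 0 if grep { $ua =~ $_ } @liars;  # asked, but known to lie
        return 1;
    }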

-- 
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
[EMAIL PROTECTED] URL:http://www.stonehenge.com/merlyn/
Perl/Unix/security consulting, Technical writing, Comedy, etc. etc.
See PerlTraining.Stonehenge.com for onsite and open-enrollment Perl training!



Re: Apache::GzipChain and scalability

2000-05-26 Thread Drew Taylor

"Randal L. Schwartz" wrote:
 
  "Drew" == Drew Taylor [EMAIL PROTECTED] writes:
 
 Drew I have read debates in the past about which browsers will reliably
 Drew accept gzip content. Do you have a list of such browsers? IIRC, it was
 Drew IE that was the most troublesome w/ proper display.
 
 Why base it on browser?  Won't checking the request "Accept-Encoding:" for
 /gzip/ be enough?  Then the browser can tell you whether it works or
 not, and you can just have an exceptions list for those that lie.
Well, in the discussion I was following, it seemed that the browsers
didn't always do what they said. IE seemed to be the usual culprit. ;-)

I probably won't be using compression any time soon as most of my
bandwidth comes from images. But it's good to know that it does work
pretty reliably.

-- 
Drew Taylor
Vialogix Communications, Inc.
501 N. College Street
Charlotte, NC 28202
704 370 0550
http://www.vialogix.com/



Re: Apache::GzipChain

2000-04-11 Thread Rick Myers

On Apr 11, 2000 at 07:49:57 -0400, Paul G. Weiss twiddled the keys to say:
 I was playing around with this module and got strange
 results (both with MSIE 5.0 and Netscape 4.6).  The
 output is being sent chunked and when I do "view source"
 it appears that the browsers have not received the
 complete page.  I suspect that they are only reading
 up to the first "chunk".

Your suspicion is correct. The problem, however, lies in whatever module
you're using to feed GzipChain. In my case it was Apache::PassFile, which
sends in 16k chunks. Apache::PassHtml looks like it suffers from a
similar problem, but I've not tried it.

Rick Myers[EMAIL PROTECTED]

The Feynman Problem   1) Write down the problem.
Solving Algorithm 2) Think real hard.
  3) Write down the answer.



RE: Apache::GzipChain

2000-04-11 Thread Paul G. Weiss

I'm using Apache::RegistryNG to feed GzipChain.  But doesn't
the chunking occur *after* GzipChain?  I've also tried
Apache::Registry with the same results.

-P

 -Original Message-
 From: Rick Myers [mailto:[EMAIL PROTECTED]]
 Sent: Tuesday, April 11, 2000 8:22 AM
 To: Paul G. Weiss
 Subject: Re: Apache::GzipChain
 
 
 On Apr 11, 2000 at 08:19:50 -0400, Rick Myers twiddled the 
 keys to say:
  
  Your suspicion is correct. The problem, however, lies in 
 whatever module
  you're using to feed GzipChain. In my case it was 
 Apache::PassFile, which
  sends in 16k chunks. Apache::PassHtml looks like it suffers from a
  similar problem, but I've not tried it.
 
 Whoops.. I'm a bit quick on the send button this morning.
 
 Here's the patch for PassFile..
 
 --- PassFile.pm.orig	Fri Mar 31 05:18:32 2000
 +++ PassFile.pm	Fri Mar 31 05:10:39 2000
 @@ -30,9 +30,8 @@
      my($buf,$read);
      local $\;
  
 +    my $size = (stat _)[7];
      while (){
 -      defined($read = sysread($fh, $buf, $BUFFERSIZE)) or return SERVER_ERROR;
 +      defined($read = sysread($fh, $buf, $size)) or return SERVER_ERROR;
        last unless $read;
        print $buf;
      }
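 
 (i.e. the single sysread is sized to the whole file via stat, so
 GzipChain sees one buffer instead of a series of 16k $BUFFERSIZE
 chunks, and emits one gzip stream.)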
 
 Rick Myers[EMAIL PROTECTED]
 
 The Feynman Problem   1) Write down the problem.
 Solving Algorithm 2) Think real hard.
   3) Write down the answer.
 



Apache::GzipChain

2000-04-11 Thread Paul G. Weiss

I was playing around with this module and got strange
results (both with MSIE 5.0 and Netscape 4.6).  The
output is being sent chunked and when I do "view source"
it appears that the browsers have not received the
complete page.  I suspect that they are only reading
up to the first "chunk".

When I use lwp-download to read the page and then run
the contents (without the headers) through gunzip then
the complete page is indeed there.

Is there something to using this module that I'm missing?
Is there a way to force Apache to use Apache::GzipChain 
but not chunk the output?

-P



Re: Apache::GzipChain

2000-04-11 Thread Rick Myers

On Apr 11, 2000 at 08:36:42 -0400, Paul G. Weiss twiddled the keys to say:
 I'm using Apache::RegistryNG to feed GzipChain.  But doesn't
 the chunking occur *after* GzipChain?  I've also tried
 Apache::Registry with the same results.

From the Apache::GzipChain man page..

   GzipChain compresses every single buffer content it
   receives via the output chain separately according to the
   GZIP specification (RFC 1952). The compression ratio
   therefore suffers if the other module sends its data in
   very small chunks.

So apparently the browsers can't cope with multiple gzip'ed chunks. If
you send the content to GzipChain in one large hunk, the problem goes
away.
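
In handler terms that means buffering, e.g. (a sketch; the render_*
helpers are hypothetical stand-ins for whatever builds your page):

    # build the whole response in memory, then print once, so GzipChain
    # receives a single buffer and emits a single gzip stream
    my $out = '';
    $out .= render_header($r);
    $out .= render_body($r);
    $r->print($out);   # one print == one chunk down the chain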

Rick Myers[EMAIL PROTECTED]

The Feynman Problem   1) Write down the problem.
Solving Algorithm 2) Think real hard.
  3) Write down the answer.



Re: Apache::GzipChain and Netscape 4.7 doesn't work with huge files??

2000-03-30 Thread Honza Pazdziora

On Mon, Mar 27, 2000 at 05:05:47PM +0200, Janning Vygen wrote:

 of the page. calling "view page source" only shows half of the content. 
 
 It does work with some other and smaller files. With huge files (about 50k and
 lots of table rows and data) it's broken. 
 
 does anybody else notice this strange behaviour of netscape 4.7???

I'm unable to reproduce it here. What happens if you just take the
content, gzip it off-line, and put it on the server as html.gz? Will NN
gunzip it OK? If you do a telnet yourserver 80 and request that file,
do the headers differ?
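
Something like this, by hand (then compare the Content-Type,
Content-Encoding and Content-Length headers you get for the two cases):

    $ telnet yourserver 80
    GET /page.html HTTP/1.0
    Accept-Encoding: gzip
    <blank line to end the request>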

-- 

 Honza Pazdziora | [EMAIL PROTECTED] | http://www.fi.muni.cz/~adelton/
   .project: Perl, DBI, Oracle, MySQL, auth. WWW servers, MTB, Spain.