Oded Arbel wrote:

On Sunday 21 September 2003 23:54, Tzafrir Cohen wrote:


Apparently, the browser gets stuck when ob_gzhandler is active and there
is output emitted before the handler is installed. That data gets flushed
out of the output buffer and is sent in front of the gzip-compressed
body, but after the headers - which corrupts the response. Additionally,
ob_gzhandler waits for the entire page to render before compressing it
and sending it all away in one big chunk - unlike
zlib.output_compression, which isn't affected by whitespace before the
headers and compresses on the fly.
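For reference, a sketch of the two approaches being compared, expressed as php.ini directives (the two are mutually exclusive in PHP - pick one, and note that ob_gzhandler must be installed before any output is produced):

```ini
; Option 1: stream-compress output on the fly; tolerant of stray
; whitespace emitted before the script's own output.
zlib.output_compression = On

; Option 2 (the one being debugged here): buffer the whole page and
; gzip it in one chunk via the output-buffering callback.
; Equivalent in code: ob_start('ob_gzhandler'); - which must run
; before ANY output, or the early bytes escape the buffer.
;output_handler = ob_gzhandler
```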

What I don't understand is why I see this as a network error - tcpdumping
at the server, I can see the response packet being sent out but never
getting an ACK. Also - I don't understand why it does work after a long
time - giving the browser something like two minutes will allow it to
eventually get the page up. This behavior is not consistent with either a
network problem or a gzip compression issue.


Large packets? Fragmentation of some sort? (MAC level? TCP level?)



I've tried setting the MTU as low as 700, and then I see the window size
for the resulting packets is 2000-something. I'm not sure what that
means, but packets still get stuck. Can you suggest something I should
try?


Window size has nothing to do with fragmentation.
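To make that distinction concrete, here is a small arithmetic sketch (header sizes assume no IP/TCP options; the 700-byte MTU and ~2000-byte window figures are taken from the message above) of how the MTU bounds segment size while the advertised window is a separate, independent figure:

```python
# Sketch: MTU limits how large each unfragmented TCP segment can be
# (via the MSS), while the advertised window only says how much
# unacknowledged data the receiver will buffer. They are independent.

IP_HEADER = 20   # bytes, assuming no IP options
TCP_HEADER = 20  # bytes, assuming no TCP options

def mss_for_mtu(mtu):
    """Largest TCP payload that fits in one unfragmented IP packet."""
    return mtu - IP_HEADER - TCP_HEADER

# With the MTU lowered to 700 (as tried above), each segment can carry
# at most 660 bytes of payload.
print(mss_for_mtu(700))  # -> 660

# Yet the receiver can still advertise a ~2000-byte window: that just
# means about 3 such segments may be in flight before an ACK is due.
window = 2000
print(window // mss_for_mtu(700))  # -> 3
```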

Can you send a dump of the TCP exchange from the server's point of view?
(Trying from the client produces nothing interesting.)



You got large chunks when ob_gzhandler was working.



Yes, got that one :-) The only trouble is - why would this be a problem?






--
Shachar Shemesh
Open Source integration consultant
Home page & resume - http://www.shemesh.biz/



=================================================================
To unsubscribe, send mail to [EMAIL PROTECTED] with
the word "unsubscribe" in the message body, e.g., run the command
echo unsubscribe | mail [EMAIL PROTECTED]


