Hi debianers,
I've contacted squid's upstream to help clarify some details in this thread
and am now forwarding Amos' reply:
Thanks Luigi, you may have to relay this back to the list. I can't seem to
post a reply to the thread.
I looked at that Debian bug a while back when first
Amos Jeffries a...@treenet.co.nz writes:
Strange that you should not know where the patch is Goswin since you
were the first and only one to mention it in this thread.
The answer is in the upstream bug report.
http://bugs.squid-cache.org/show_bug.cgi?id=2624
Actually the answer is in the
Strange that you should not know where the patch is Goswin since you
were the first and only one to mention it in this thread.
The answer is in the upstream bug report.
http://bugs.squid-cache.org/show_bug.cgi?id=2624
It should be noted that the patch only affects the IMS (If-Modified-Since) replies apt gets back
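(Illustration, not a quote from the thread; host, path and dates are made up: an If-Modified-Since revalidation is the following exchange, and the IMS reply referred to above is the answer apt gets back to such a request.)

  GET /debian/dists/sid/Release HTTP/1.1
  Host: ftp.debian.org
  If-Modified-Since: Mon, 17 May 2010 08:00:00 GMT

  HTTP/1.1 304 Not Modified
  Date: Wed, 19 May 2010 12:00:00 GMT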
On Wed, May 19, 2010 at 03:28:00PM +0200, Bjørn Mork wrote:
Pierre Habouzit madco...@madism.org writes:
On Wed, May 19, 2010 at 10:42:55AM +0200, Bjørn Mork wrote:
2) http proxy servers cannot always process pipelined requests due to
the complexity this adds (complexity is always bad
Petter Reinholdtsen p...@hungry.com writes:
[Roger Lynn]
But apt has been using pipelining for years. Why has this only just
become a problem?
It has been a problem in Debian Edu for years. Just recently I
figured out the cause and a workaround.
And FWIW I have experienced this problem for
On Wed, May 19, 2010 at 10:42:55AM +0200, Bjørn Mork wrote:
2) http proxy servers cannot always process pipelined requests due to
the complexity this adds (complexity is always bad for security), and
This is bullshit. It's *VERY* easy to support pipelining: parse one
request at a time, and
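(A minimal sketch of that approach, for illustration only; plain Python, nothing to do with squid's actual code, and the handler and canned 200 reply are made up: read one request, answer it, and only then look at whatever else is already buffered.)

import socket

def handle_connection(conn: socket.socket) -> None:
    """Serve pipelined HTTP requests strictly one at a time.
    Assumes bodyless requests (GET), so a request ends at the blank line."""
    buf = b""
    while True:
        # Read just enough to have one complete request header in the buffer.
        while b"\r\n\r\n" not in buf:
            chunk = conn.recv(4096)
            if not chunk:
                return                  # client closed the connection
            buf += chunk
        head, _, buf = buf.partition(b"\r\n\r\n")
        # 'buf' now still holds any pipelined follow-up requests, untouched.
        request_line = head.split(b"\r\n", 1)[0].decode("ascii", "replace")
        # ... look up / fetch the resource named in request_line here ...
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n")
        # Only now do we come back around and parse the next buffered request.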
Pierre Habouzit madco...@madism.org writes:
On Wed, May 19, 2010 at 10:42:55AM +0200, Bjørn Mork wrote:
2) http proxy servers cannot always process pipelined requests due to
the complexity this adds (complexity is always bad for security), and
This is bullshit. It's *VERY* easy to support
Robert Collins robe...@robertcollins.net writes:
Well, I don't know why something has 'suddenly' become a problem: it's
a known issue for years. The HTTP smuggling
[http://www.watchfire.com/resources/HTTP-Request-Smuggling.pdf]
attacks made that very obvious 5 years ago now.
Reading that I
On 2010-05-19, Goswin von Brederlow goswin-...@web.de wrote:
Reading that I don't think that is really a pipelining issue. You do not
need pipelining for it to work. The real problem is keep-alive. The
connection isn't destroyed after each request so you can put multiple
requests into the
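(Illustration only, not a quote from the thread: the difference on the wire is roughly the following; keep-alive alone still goes strictly in turn, pipelining sends the second request before the first reply has been read.)

  keep-alive only:  GET /a  ->  reply /a  ->  GET /b  ->  reply /b    (one TCP connection, strictly in turn)
  pipelining:       GET /a  GET /b  ->  reply /a  reply /b            (replies must come back in request order)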
Bjørn Mork bj...@mork.no writes:
Petter Reinholdtsen p...@hungry.com writes:
[Roger Lynn]
But apt has been using pipelining for years. Why has this only just
become a problem?
It has been a problem in Debian Edu for years. Just recently I
figured out the cause and a workaround.
And
Bjørn Mork bj...@mork.no writes:
Pierre Habouzit madco...@madism.org writes:
On Wed, May 19, 2010 at 10:42:55AM +0200, Bjørn Mork wrote:
2) http proxy servers cannot always process pipelined requests due to
the complexity this adds (complexity is always bad for security), and
This is
#include <hallo.h>
* Robert Collins [Tue, May 18 2010, 02:02:59PM]:
Given that pipelining is broken by design, that the HTTP WG has
And if not? Counter example: it seems to work just fine with my
apt-cacher-ng proxy; at least no bug reports related to that have
appeared for about a year now.
Goswin von Brederlow goswin-...@web.de writes:
A HTTP/1.1 conforming server or proxy
This is not the real world...
is free to process pipelined
requests serially one by one. The only requirement is that it does not
corrupt the second request by reading all available data into a buffer,
Hi all,
I don't want to interrupt your battles so feel free to ignore me,
but I want to raise some questions (for you and me) none the less:
The notice about the - in the eyes of the writer of this manpage
section - broken squid version 2.0.2 in the apt.conf manpage
was changed the last time in
On Wed, May 19, 2010 at 03:28:00PM +0200, Bjørn Mork bj...@mork.no was heard
to say:
Pierre Habouzit madco...@madism.org writes:
This is bullshit. It's *VERY* easy to support pipelining: parse one
request at a time, and until you're done with a given request, you just
stop to watch the
Philipp Kern tr...@philkern.de writes:
On 2010-05-19, Goswin von Brederlow goswin-...@web.de wrote:
Reading that I don't think that is really a pipelining issue. You do not
need pipelining for it to work. The real problem is keep-alive. The
connection isn't destroyed after each request so
Bjørn Mork bj...@mork.no writes:
Goswin von Brederlow goswin-...@web.de writes:
A HTTP/1.1 conforming server or proxy
This is not the real world...
is free to process pipelined
requests serially one by one. The only requirement is that it does not
corrupt the second request by reading
David Kalnischkies kalnischkies+deb...@gmail.com writes:
Hi all,
I don't want to interrupt your battles so feel free to ignore me,
but I want to raise some questions (for you and me) none the less:
The notice about the - in the eyes of the writer of this manpage
section - broken squid
Daniel Burrows dburr...@debian.org writes:
On Wed, May 19, 2010 at 03:28:00PM +0200, Bjørn Mork bj...@mork.no was
heard to say:
Pierre Habouzit madco...@madism.org writes:
This is bullshit. It's *VERY* easy to support pipelining: parse one
request at a time, and until you're done with a
Marvin Renich m...@renich.org writes:
* Robert Collins robe...@robertcollins.net [100517 17:42]:
Due to the widespread usage of intercepting proxies, it's very hard, if
not impossible, to determine if a proxy is in use. It's unwise, at
best, to assume that no proxy configured == no proxy
* Robert Collins robe...@robertcollins.net [100517 22:03]:
Given that pipelining is broken by design, that the HTTP WG has
increased the number of concurrent connections that are recommended,
and removed the upper limit - no. I don't think that disabling
pipelining hurts anyone - just use a
* Goswin von Brederlow goswin-...@web.de [100518 02:53]:
Marvin Renich m...@renich.org writes:
Documenting this problem somewhere that an admin would look when seeing
the offending Hash sum mismatch message would also help. Turning off
pipelining by default for everybody seems like the
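(For reference: this excerpt doesn't say which workaround was used, but the usual admin-side one is to turn apt's pipelining off in apt.conf, e.g.)

  # /etc/apt/apt.conf.d/99-no-pipelining   (file name is arbitrary)
  Acquire::http::Pipeline-Depth "0";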
On Mon, May 17, 2010 at 09:54:28PM +0200, Florian Weimer wrote:
* Petter Reinholdtsen:
I am bothered by URL: http://bugs.debian.org/56 , and the fact
that apt(-get,itude) do not work with Squid as a proxy. I would very
much like to have apt work out of the box with Squid in Squeeze.
On 17 May 2010, at 09:02, Goswin von Brederlow wrote:
Given that squid already has a patch, although only for newer versions,
this really seems to be a squid bug. As such it should be fixed in
squid as not only apt might trigger the problem.
Goswin, can you please point me to
On Tue, May 18, 2010 at 02:09:13PM +0200, Mike Hommey wrote:
Mozilla browsers have had pipelining disabled for years, because the
reality is that a whole lot of servers don't implement it properly, if
at all.
Actually, I've had pipelining enabled for some time, and it works just
fine for me. I
Luigi Gangitano lu...@debian.org writes:
On 17 May 2010, at 09:02, Goswin von Brederlow wrote:
Given that squid already has a patch, although only for newer versions,
this really seems to be a squid bug. As such it should be fixed in
squid as not only apt might trigger the
On 18/05/10 03:10, Robert Collins wrote:
Given that pipelining is broken by design, that the HTTP WG has
increased the number of concurrent connections that are recommended,
and removed the upper limit - no. I don't think that disabling
pipelining hurts anyone - just use a couple more
Well, I don't know why something has 'suddenly' become a problem: it's
a known issue for years. The HTTP smuggling
[http://www.watchfire.com/resources/HTTP-Request-Smuggling.pdf]
attacks made that very obvious 5 years ago now.
http://en.wikipedia.org/wiki/HTTP_pipelining has a decent overview.
On 19 May 2010 13:51, Robert Collins robe...@robertcollins.net wrote:
Well, I don't know why something has 'suddenly' become a problem: it's
a known issue for years. The HTTP smuggling
[http://www.watchfire.com/resources/HTTP-Request-Smuggling.pdf]
attacks made that very obvious 5 years ago
Bah, link staleness.
http://www.cgisecurity.com/lib/HTTP-Request-Smuggling.pdf just worked for me.
Also, I realise that there may be a disconnect here: squid *shouldn't*
break if a client attempts to pipeline through it - if it is, that's a
bug to be fixed, squid just will not read the second
[Roger Lynn]
But apt has been using pipelining for years. Why has this only just
become a problem?
It has been a problem in Debian Edu for years. Just recently I
figured out the cause and a workaround.
Happy hacking,
--
Petter Reinholdtsen
Petter Reinholdtsen p...@hungry.com writes:
I am bothered by URL: http://bugs.debian.org/56 , and the fact
that apt(-get,itude) do not work with Squid as a proxy. I would very
much like to have apt work out of the box with Squid in Squeeze. To
fix it one can either change Squid to work
* Petter Reinholdtsen:
I am bothered by URL: http://bugs.debian.org/56 , and the fact
that apt(-get,itude) do not work with Squid as a proxy. I would very
much like to have apt work out of the box with Squid in Squeeze. To
fix it one can either change Squid to work with pipelining the
Due to the widespread usage of intercepting proxies, it's very hard, if
not impossible, to determine if a proxy is in use. It's unwise, at
best, to assume that no proxy configured == no proxy processing your
traffic :(.
-Rob
* Robert Collins robe...@robertcollins.net [100517 17:42]:
Due to the widespread usage of intercepting proxies, it's very hard, if
not impossible, to determine if a proxy is in use. It's unwise, at
best, to assume that no proxy configured == no proxy processing your
traffic :(.
-Rob
IANADD,
Given that pipelining is broken by design, that the HTTP WG has
increased the number of concurrent connections that are recommended,
and removed the upper limit - no. I don't think that disabling
pipelining hurts anyone - just use a couple more concurrent
connections.
-Rob
On Tue, 2010-05-18 at 14:02 +1200, Robert Collins wrote:
Given that pipelining is broken by design, that the HTTP WG has
increased the number of concurrent connections that are recommended,
and removed the upper limit - no. I don't think that disabling
pipelining hurts anyone - just use a
I am bothered by URL: http://bugs.debian.org/56 , and the fact
that apt(-get,itude) do not work with Squid as a proxy. I would very
much like to have apt work out of the box with Squid in Squeeze. To
fix it one can either change Squid to work with pipelining the way APT
uses, which the Squid