Hi.
I am writing a robot for a search engine. The robot must harvest all
files which are shorter than a few kilobytes (let's say 100 kB); longer
files are not important, because they are often archives or long pages
about nothing.
My first point: I cannot find a robust way to drop a connection when
the incoming data stream exceeds the upper limit.
My second point is related to the retry handling in your docs
(http://jakarta.apache.org/commons/httpclient/tutorial.html - the catch
block for HttpRecoverableException). When I did something like this, I
found out that I had to call method.recycle() in the catch block, or
the connection was not released.
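The retry-with-recycle pattern described above can be sketched generically. This is only an illustration of the control flow, not HttpClient code: RecoverableException, FlakyFetcher, fetch(), and MAX_RETRIES below are hypothetical stand-ins for HttpRecoverableException, the HttpMethod object, executeMethod(), and a retry budget.

```java
// Generic sketch of retry-on-recoverable-error, modeled on the
// tutorial's catch block for HttpRecoverableException.
// All names here are illustrative stand-ins, not HttpClient API.
class RecoverableException extends Exception {}

class FlakyFetcher {
    private int failuresLeft;

    FlakyFetcher(int failures) { this.failuresLeft = failures; }

    // Fails with a recoverable error a fixed number of times, then succeeds.
    String fetch() throws RecoverableException {
        if (failuresLeft > 0) {
            failuresLeft--;
            throw new RecoverableException();
        }
        return "body";
    }

    // Analogue of method.recycle(): reset per-attempt state before retrying.
    void recycle() { /* reset internal state here */ }
}

public class RetrySketch {
    static final int MAX_RETRIES = 3;

    static String fetchWithRetry(FlakyFetcher fetcher)
            throws RecoverableException {
        RecoverableException last = null;
        for (int attempt = 0; attempt < MAX_RETRIES; attempt++) {
            try {
                return fetcher.fetch();
            } catch (RecoverableException e) {
                last = e;
                // Without the reset, the next attempt reuses stale state --
                // the symptom described above when recycle() is skipped.
                fetcher.recycle();
            }
        }
        throw last; // budget exhausted
    }
}
```

The key point is that the reset happens inside the catch block, before the loop tries again.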
Hi Leo,
Here are a few additions to Arian's comments.
> I cannot find a robust way to drop a connection (GET over HTTP/1.0
> and HTTP/1.1) when the incoming data stream exceeds the upper limit.
> I do it by closing the input stream returned by
> getResponseBodyAsStream.
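A minimal sketch of the size-cap idea using only java.io: read the body stream up to a limit and stop as soon as it is exceeded. In the real robot the stream would come from getResponseBodyAsStream, and the caller would then close it (and release the connection) rather than draining the rest of the body; readAtMost is a hypothetical helper name, not part of HttpClient.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class SizeCap {
    // Reads at most maxBytes from in. Returns the body if it fits,
    // or null if the stream is longer than maxBytes -- in that case the
    // caller should close the stream to drop the connection instead of
    // reading the remainder.
    static byte[] readAtMost(InputStream in, int maxBytes)
            throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = in.read(chunk)) != -1) {
            if (buf.size() + n > maxBytes) {
                return null; // over the limit: abandon the body
            }
            buf.write(chunk, 0, n);
        }
        return buf.toByteArray();
    }
}
```

With this shape the robot never buffers more than the cap, and oversized responses are detected as early as one read past the limit.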