http.content.limit is broken in the protocol-httpclient plugin
--------------------------------------------------------------
Key: NUTCH-481
URL: https://issues.apache.org/jira/browse/NUTCH-481
Project: Nutch
Issue Type: Bug
Components: fetcher
Affects Versions: 0.9.0
Reporter: charlie wanek
When using the protocol-httpclient plugin, the entire content of the requested
URL is retrieved from the web server, regardless of the http.content.limit
configuration setting. (The issue does not affect the protocol-http plugin.)
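For context, the limit in question is the http.content.limit property, normally set in conf/nutch-site.xml to override the shipped default in conf/nutch-default.xml. The fragment below shows the stock default value (65536 bytes); a value of -1 disables truncation:

```xml
<property>
  <name>http.content.limit</name>
  <value>65536</value>
  <description>The length limit for downloaded content, in bytes.
  If this value is nonnegative (>=0), content longer than it will
  be truncated; otherwise, no truncation at all.</description>
</property>
```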
For very large documents, this causes the Fetcher to conclude that a
FetcherThread is hung; the Fetcher then aborts its run, logging a warning about
hung threads (Fetcher.java:433).
org.apache.nutch.protocol.httpclient.HttpResponse correctly counts the content
length and breaks out of its read loop at the right point.
However, when HttpResponse closes the InputStream it is reading from, that
stream (an org.apache.commons.httpclient.AutoCloseInputStream) goes on to read
the entire remaining body of the document from the web server.
Though I'm not certain this is the correct solution, a quick test shows that if
HttpResponse is changed to abort the GET instead of merely closing the stream,
the InputStream stops reading from the web server and the FetcherThread can
continue.
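The actual fix belongs in org.apache.nutch.protocol.httpclient.HttpResponse, but the truncation logic can be illustrated standalone. The sketch below (plain java.io; class and method names are illustrative, not the Nutch source) mirrors the counting read loop: it stops writing once the configured limit is reached. In the real class, the break would need to be followed by GetMethod.abort() rather than a plain close(), since closing commons-httpclient's AutoCloseInputStream drains the rest of the body.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ContentLimitDemo {

    // Read at most 'limit' bytes from 'in', mirroring the counting loop
    // in HttpResponse (illustrative; not the actual Nutch source).
    static byte[] readLimited(InputStream in, int limit) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[2048];
        int total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            if (limit >= 0 && total + n > limit) {
                out.write(buf, 0, limit - total);  // keep only up to the limit
                break;  // in HttpResponse, GetMethod.abort() belongs here
            }
            out.write(buf, 0, n);
            total += n;
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] body = new byte[100_000];           // pretend 100 KB response
        byte[] kept = readLimited(new ByteArrayInputStream(body), 65536);
        System.out.println(kept.length);           // prints 65536
    }
}
```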
--
This message is automatically generated by JIRA.
_______________________________________________
Nutch-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-developers