[
https://issues.apache.org/jira/browse/DROIDS-105?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Richard Frovarp updated DROIDS-105:
-----------------------------------
Fix Version/s: (was: 0.2.0)
0.3.0
> missing caching for robots.txt
> ------------------------------
>
> Key: DROIDS-105
> URL: https://issues.apache.org/jira/browse/DROIDS-105
> Project: Droids
> Issue Type: Improvement
> Components: core
> Reporter: Paul Rogalinski
> Fix For: 0.3.0
>
> Attachments: Caching-Support-and-Robots_txt-fix.patch,
> CachingContentLoader.java
>
>
> The current implementation of the HttpClient will not cache any requests to
> the robots.txt file. When using the CrawlingWorker, this results in two
> requests to robots.txt (HEAD + GET) per crawled URL, so crawling 3 URLs sends
> the target server 6 requests for robots.txt.
> Unfortunately, the contentLoader is declared final in HttpProtocol, so there
> is no way to replace it with a caching loader like the one you'll find in
> the attachment.
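The caching loader described above could be sketched as a decorator that wraps an existing loader and memoizes robots.txt responses. This is a minimal illustration only, not the attached patch: the `ContentLoader` interface below is a hypothetical stand-in (the real Droids contract lives in the project's API package and may have a different signature), and only the robots.txt special-casing matters here.

```java
import java.net.URI;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical stand-in for Droids' content-loader contract;
// the real interface may differ in name and signature.
interface ContentLoader {
    byte[] load(URI uri) throws Exception;
}

// Sketch of a caching decorator: delegates all loads to the wrapped
// loader, but memoizes robots.txt responses so that repeated lookups
// for the same host cost only one network round trip.
class CachingContentLoader implements ContentLoader {
    private final ContentLoader delegate;
    private final Map<URI, byte[]> cache = new ConcurrentHashMap<>();

    CachingContentLoader(ContentLoader delegate) {
        this.delegate = delegate;
    }

    @Override
    public byte[] load(URI uri) throws Exception {
        // Only robots.txt is cached; ordinary page fetches pass through.
        if (uri.getPath() == null || !uri.getPath().endsWith("/robots.txt")) {
            return delegate.load(uri);
        }
        byte[] cached = cache.get(uri);
        if (cached == null) {
            cached = delegate.load(uri); // single fetch per robots.txt URI
            cache.put(uri, cached);
        }
        return cached;
    }
}
```

Because it implements the same interface, such a decorator could replace the plain loader transparently, which is exactly what the final field in HttpProtocol currently prevents.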
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators:
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira