-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33112/
-----------------------------------------------------------

(Updated April 15, 2015, 3:56 a.m.)


Review request for nutch.


Bugs: NUTCH-1927
    https://issues.apache.org/jira/browse/NUTCH-1927


Repository: nutch


Description
-------

Based on discussion on the dev list, to support valid security-research use 
cases for Nutch (e.g., DDoS, DNS, and other testing), I am going to create a 
patch that adds a whitelist:
<property>
  <name>robot.rules.whitelist</name>
  <value>132.54.99.22,hostname.apache.org,foo.jpl.nasa.gov</value>
  <description>Comma-separated list of hostnames or IP addresses for which 
robots.txt rules parsing is skipped.
  </description>
</property>
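The whitelist check behind this property can be sketched roughly as follows. This is a minimal illustration only; the class and method names (`WhitelistCheck`, `isWhiteListed`) mirror the log output below but are not necessarily the exact code in the patch:

```java
import java.util.HashSet;
import java.util.Set;

/** Illustrative sketch of a robots.txt whitelist check (names are hypothetical). */
public class WhitelistCheck {

    private final Set<String> whitelist = new HashSet<>();

    /** Parses a comma-separated robot.rules.whitelist property value. */
    public WhitelistCheck(String propertyValue) {
        if (propertyValue != null) {
            for (String entry : propertyValue.split(",")) {
                String trimmed = entry.trim().toLowerCase();
                if (!trimmed.isEmpty()) {
                    whitelist.add(trimmed);
                }
            }
        }
    }

    /** Returns true if the host (name or IP) should skip robots.txt parsing. */
    public boolean isWhiteListed(String host) {
        return host != null && whitelist.contains(host.toLowerCase());
    }

    public static void main(String[] args) {
        WhitelistCheck check =
            new WhitelistCheck("132.54.99.22, hostname.apache.org, foo.jpl.nasa.gov");
        System.out.println(check.isWhiteListed("hostname.apache.org")); // true
        System.out.println(check.isWhiteListed("example.com"));         // false
    }
}
```

With this shape, hosts matching an entry bypass robots.txt entirely, while all other hosts go through the normal RobotRulesParser path.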


Diffs (updated)
-----

  ./trunk/CHANGES.txt 1673623 
  ./trunk/conf/nutch-default.xml 1673623 
  ./trunk/src/java/org/apache/nutch/protocol/RobotRules.java 1673623 
  ./trunk/src/java/org/apache/nutch/protocol/RobotRulesParser.java 1673623 
  ./trunk/src/java/org/apache/nutch/protocol/WhiteListRobotRules.java PRE-CREATION 
  
  ./trunk/src/plugin/lib-http/src/java/org/apache/nutch/protocol/http/api/HttpRobotRulesParser.java 1673623 
  
  ./trunk/src/plugin/protocol-ftp/src/java/org/apache/nutch/protocol/ftp/FtpRobotRulesParser.java 1673623 

Diff: https://reviews.apache.org/r/33112/diff/


Testing
-------

Tested using RobotRulesParser (in the o.a.n.protocol package) against my home 
server. The robots.txt looks like:

[chipotle:~/src/nutch] mattmann% more robots.txt 
User-agent: *
Disallow: /
[chipotle:~/src/nutch] mattmann% 

urls file:

[chipotle:~/src/nutch] mattmann% more urls 
http://baron.pagemewhen.com/~chris/foo1.txt
http://baron.pagemewhen.com/~chris/
[chipotle:~/src/nutch] mattmann% 

[chipotle:~/src/nutch] mattmann% java -cp build/apache-nutch-1.10-SNAPSHOT.job:build/apache-nutch-1.10-SNAPSHOT.jar:runtime/local/lib/hadoop-core-1.2.0.jar:runtime/local/lib/crawler-commons-0.5.jar:runtime/local/lib/slf4j-log4j12-1.6.1.jar:runtime/local/lib/slf4j-api-1.7.9.jar:runtime/local/lib/log4j-1.2.15.jar:runtime/local/lib/guava-11.0.2.jar:runtime/local/lib/commons-logging-1.1.1.jar org.apache.nutch.protocol.RobotRulesParser robots.txt urls Nutch-crawler
Apr 12, 2015 9:22:50 AM org.apache.nutch.protocol.WhiteListRobotRules isWhiteListed
INFO: Host: [baron.pagemewhen.com] is whitelisted and robots.txt rules parsing will be ignored
allowed:        http://baron.pagemewhen.com/~chris/foo1.txt
Apr 12, 2015 9:22:50 AM org.apache.nutch.protocol.WhiteListRobotRules isWhiteListed
INFO: Host: [baron.pagemewhen.com] is whitelisted and robots.txt rules parsing will be ignored
allowed:        http://baron.pagemewhen.com/~chris/
[chipotle:~/src/nutch] mattmann%


Thanks,

Chris Mattmann
