[jira] [Comment Edited] (NUTCH-1031) Delegate parsing of robots.txt to crawler-commons

2013-05-09 Thread Tejas Patil (JIRA)

[ 
https://issues.apache.org/jira/browse/NUTCH-1031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13652805#comment-13652805
 ] 

Tejas Patil edited comment on NUTCH-1031 at 5/9/13 7:52 AM:


I had forgotten to add the crawler-commons dependency in pom.xml. 
Just committed that to trunk (rev 1480551) and 2.x (rev 1480550).

  was (Author: tejasp):
I had forgotten to add the crawler-commons dependency in pom.xml. 
Just committed that to trunk (rev 1480551) and 2.x (rev 1480551).
  
> Delegate parsing of robots.txt to crawler-commons
> --------------------------------------------------
>
> Key: NUTCH-1031
> URL: https://issues.apache.org/jira/browse/NUTCH-1031
> Project: Nutch
>  Issue Type: Task
>Reporter: Julien Nioche
>Assignee: Tejas Patil
>Priority: Minor
>  Labels: robots.txt
> Fix For: 1.7, 2.2
>
> Attachments: CC.robots.multiple.agents.patch, 
> CC.robots.multiple.agents.v2.patch, NUTCH-1031-2.x.v1.patch, 
> NUTCH-1031-trunk.v2.patch, NUTCH-1031-trunk.v3.patch, 
> NUTCH-1031-trunk.v4.patch, NUTCH-1031-trunk.v5.patch, NUTCH-1031.v1.patch
>
>
> We're about to release the first version of Crawler-Commons 
> [http://code.google.com/p/crawler-commons/] which contains a parser for 
> robots.txt files. This parser should also be better than the one we currently 
> have in Nutch. I will delegate this functionality to CC as soon as it is 
> available publicly



[jira] [Comment Edited] (NUTCH-1031) Delegate parsing of robots.txt to crawler-commons

2013-03-08 Thread Lewis John McGibbney (JIRA)

[ 
https://issues.apache.org/jira/browse/NUTCH-1031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13597515#comment-13597515
 ] 

Lewis John McGibbney edited comment on NUTCH-1031 at 3/8/13 8:26 PM:
---------------------------------------------------------------------

Hi Tejas. Sorry for taking forever to get around to this. 

* I really like the documentation within the patch. Big +1 for this.
* Tests all pass flawlessly.
* I like the retention of the main() method in o.a.n.p.RobotRulesParser.

I've tested this on several websites, including many directories within sites 
like bbc.co.uk (check out its robots.txt).
I am +1 for this Tejas. Good work on this one, it's been a long time coming 
to Nutch.
I am keen to hear from others.

I have one trivial gripe: there is a typo in the usage message for the main 
method in RobotRulesParser. It should be 

{code}
Usage: RobotRulesParser   
{code}

instead of 

{code}
Usage: RobotRulesParser   
{code} 
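
The angle-bracketed argument placeholders in the two usage lines above were 
stripped by the archive formatting, so both lines now read identically. As a 
rough illustration only, the sketch below re-creates that kind of command-line 
check on top of the crawler-commons parser; the class name, argument layout, 
and robots.txt URL are assumptions, not the actual Nutch RobotRulesParser code.

{code}
// Hypothetical sketch of a robots-file / url-file / agent-name style check,
// similar in spirit to the RobotRulesParser main() method discussed above.
// Class name and argument order are assumptions; only the crawler-commons
// calls (SimpleRobotRulesParser#parseContent, BaseRobotRules#isAllowed) are
// the real CC API.
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotRulesCheck {
  public static void main(String[] args) throws Exception {
    if (args.length != 3) {
      System.err.println("Usage: RobotRulesCheck <robots-file> <url-file> <agent-name>");
      System.exit(-1);
    }
    // Read the local robots.txt and the list of URLs to test against it.
    byte[] robotsTxt = Files.readAllBytes(Paths.get(args[0]));
    List<String> urls = Files.readAllLines(Paths.get(args[1]));

    // Parse once, then test every URL against the resulting rules.
    BaseRobotRules rules = new SimpleRobotRulesParser()
        .parseContent("http://example.com/robots.txt", robotsTxt, "text/plain", args[2]);

    for (String url : urls) {
      System.out.println((rules.isAllowed(url) ? "allowed" : "disallowed") + "\t" + url);
    }
  }
}
{code}

Run against a saved robots.txt and a plain-text list of URLs, it prints one 
allowed/disallowed line per URL, which is essentially the check performed when 
testing sites such as bbc.co.uk above.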

  was (Author: lewismc):
Hi Tejas. Sorry for taking forever to get around to this. 

* I really like the documentation within the patch. Big +1 for this.
* Tests all pass flawlessly.
* I like the retention of the main() method in o.a.n.p.RobotRulesParser.

I've tested this on several websites, including many directories within sites 
like bbc.co.uk (check out its robots.txt).
I am +1 for this Tejas. Good work on this one, it's been a long time coming 
to Nutch.
I am keen to hear from others.
  
> Delegate parsing of robots.txt to crawler-commons
> --------------------------------------------------
>
> Key: NUTCH-1031
> URL: https://issues.apache.org/jira/browse/NUTCH-1031
> Project: Nutch
>  Issue Type: Task
>Reporter: Julien Nioche
>Assignee: Tejas Patil
>Priority: Minor
>  Labels: robots.txt
> Fix For: 1.7
>
> Attachments: CC.robots.multiple.agents.patch, 
> CC.robots.multiple.agents.v2.patch, NUTCH-1031-trunk.v2.patch, 
> NUTCH-1031-trunk.v3.patch, NUTCH-1031-trunk.v4.patch, NUTCH-1031.v1.patch
>
>
> We're about to release the first version of Crawler-Commons 
> [http://code.google.com/p/crawler-commons/] which contains a parser for 
> robots.txt files. This parser should also be better than the one we currently 
> have in Nutch. I will delegate this functionality to CC as soon as it is 
> available publicly



[jira] [Comment Edited] (NUTCH-1031) Delegate parsing of robots.txt to crawler-commons

2013-03-08 Thread Lewis John McGibbney (JIRA)

[ 
https://issues.apache.org/jira/browse/NUTCH-1031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13597515#comment-13597515
 ] 

Lewis John McGibbney edited comment on NUTCH-1031 at 3/8/13 8:24 PM:
---------------------------------------------------------------------

Hi Tejas. Sorry for taking forever to get around to this. 

* I really like the documentation within the patch. Big +1 for this.
* Tests all pass flawlessly.
* I like the retention of the main() method in o.a.n.p.RobotRulesParser.

I've tested this on several websites, including many directories within sites 
like bbc.co.uk (check out its robots.txt).
I am +1 for this Tejas. Good work on this one, it's been a long time coming 
to Nutch.
I am keen to hear from others.

  was (Author: lewismc):
Hi Tejas. Sorry for taking forever to get around to this. 
* I really like the documentation within the patch. Big +1 for this.
* Tests all pass flawlessly.
* I like the retention of the main() method in o.a.n.p.RobotRulesParser.
I've tested this on several websites, including many directories within sites 
like bbc.co.uk (check out its robots.txt).
I am +1 for this Tejas. Good work on this one, it's been a long time coming 
to Nutch.
I am keen to hear from others.
  
> Delegate parsing of robots.txt to crawler-commons
> --------------------------------------------------
>
> Key: NUTCH-1031
> URL: https://issues.apache.org/jira/browse/NUTCH-1031
> Project: Nutch
>  Issue Type: Task
>Reporter: Julien Nioche
>Assignee: Tejas Patil
>Priority: Minor
>  Labels: robots.txt
> Fix For: 1.7
>
> Attachments: CC.robots.multiple.agents.patch, 
> CC.robots.multiple.agents.v2.patch, NUTCH-1031-trunk.v2.patch, 
> NUTCH-1031-trunk.v3.patch, NUTCH-1031-trunk.v4.patch, NUTCH-1031.v1.patch
>
>
> We're about to release the first version of Crawler-Commons 
> [http://code.google.com/p/crawler-commons/] which contains a parser for 
> robots.txt files. This parser should also be better than the one we currently 
> have in Nutch. I will delegate this functionality to CC as soon as it is 
> available publicly



[jira] [Comment Edited] (NUTCH-1031) Delegate parsing of robots.txt to crawler-commons

2013-01-21 Thread Tejas Patil (JIRA)

[ 
https://issues.apache.org/jira/browse/NUTCH-1031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13559332#comment-13559332
 ] 

Tejas Patil edited comment on NUTCH-1031 at 1/22/13 3:18 AM:
---------------------------------------------------------------------

Added a patch for Nutch trunk (NUTCH-1031-trunk.v2.patch). If nobody has 
objections, I will work on the corresponding patch for 2.x.
Summary of the changes done (a short usage sketch of the CC parser follows 
after this list):
- Removed the RobotRules class, as CC provides a replacement: BaseRobotRules
- Moved RobotRulesParser out of the http plugin on account of NUTCH-1513; 
other protocols might share it.
- Added HttpRobotRulesParser, which will be responsible for fetching the 
robots.txt file for the http protocol.
- Changed references from the old Nutch classes to the classes from CC.
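
For illustration, here is a minimal, hypothetical sketch of the delegation 
described above: robots.txt content is handed to the crawler-commons 
SimpleRobotRulesParser, and the returned BaseRobotRules object (the CC 
replacement for the old Nutch RobotRules) answers the allow/deny and 
crawl-delay questions. The host, agent name, and the plain java.net fetch are 
placeholders; in Nutch, HttpRobotRulesParser obtains the file through the 
protocol plugin instead.

{code}
// Minimal sketch, not the actual HttpRobotRulesParser code: fetch a
// robots.txt over HTTP and let crawler-commons answer the policy questions.
// The host, agent name, and java.net fetch below are placeholders.
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.URL;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotsDelegationSketch {
  public static void main(String[] args) throws Exception {
    String robotsUrl = "http://www.example.com/robots.txt";

    // Fetch the raw robots.txt bytes (Nutch would go through its protocol layer).
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    try (InputStream in = new URL(robotsUrl).openStream()) {
      byte[] chunk = new byte[4096];
      int n;
      while ((n = in.read(chunk)) != -1) {
        buffer.write(chunk, 0, n);
      }
    }

    // BaseRobotRules is the CC replacement for the old o.a.n.protocol.RobotRules.
    BaseRobotRules rules = new SimpleRobotRulesParser()
        .parseContent(robotsUrl, buffer.toByteArray(), "text/plain", "nutch-test");

    System.out.println("allow all:   " + rules.isAllowAll());
    System.out.println("allow none:  " + rules.isAllowNone());
    System.out.println("crawl delay: " + rules.getCrawlDelay());
    System.out.println("/some/page allowed: "
        + rules.isAllowed("http://www.example.com/some/page"));
  }
}
{code}

The point of the change is exactly this split: CC owns the parsing and rule 
evaluation, while Nutch only supplies the fetched bytes and the agent name.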

  was (Author: tejasp):
Added a patch for Nutch trunk (NUTCH-1031-trunk.v2.patch). If nobody has 
objections, I will work on the corresponding patch for 2.x.
Summary of the changes done:
- Removed the RobotRules class, as CC provides a replacement: BaseRobotRules
- Moved RobotRulesParser out of the http plugin on account of NUTCH-1513; 
other protocols might share it.
- Added HttpRobotRulesParser, which will be responsible for fetching the 
robots.txt file using the http protocol.
- Changed references from the old Nutch classes to the classes from CC.
  
> Delegate parsing of robots.txt to crawler-commons
> --------------------------------------------------
>
> Key: NUTCH-1031
> URL: https://issues.apache.org/jira/browse/NUTCH-1031
> Project: Nutch
>  Issue Type: Task
>Reporter: Julien Nioche
>Assignee: Tejas Patil
>Priority: Minor
>  Labels: robots.txt
> Fix For: 1.7
>
> Attachments: CC.robots.multiple.agents.patch, 
> CC.robots.multiple.agents.v2.patch, NUTCH-1031-trunk.v2.patch, 
> NUTCH-1031.v1.patch
>
>
> We're about to release the first version of Crawler-Commons 
> [http://code.google.com/p/crawler-commons/] which contains a parser for 
> robots.txt files. This parser should also be better than the one we currently 
> have in Nutch. I will delegate this functionality to CC as soon as it is 
> available publicly

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira