Hi to all, is there any way of deleting URLs from the db? The necessity arises
because I need to crawl within a country's IP range.
My idea was, after each update, to get a dump of the URLs from the crawldb,
resolve their IPs (by means of a Linux script or something),
and delete the unwanted URLs before the next generate cycle. I'm not familiar
enough with the Nutch code to write a plugin for that.
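The dump-resolve-filter step described above could be sketched roughly as follows. This is only an illustration, not Nutch code: the IP ranges are placeholder documentation networks, and the helper names (`ip_in_country`, `filter_unwanted`) are made up for this example.

```python
import ipaddress
import socket
from urllib.parse import urlparse

# Placeholder ranges standing in for a real country IP allocation.
COUNTRY_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def ip_in_country(ip: str) -> bool:
    """True if the address falls inside any of the country's ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in COUNTRY_NETWORKS)

def filter_unwanted(urls):
    """Yield the URLs whose host resolves OUTSIDE the country's ranges,
    i.e. the ones to delete before the next generate cycle."""
    for url in urls:
        host = urlparse(url).hostname
        try:
            ip = socket.gethostbyname(host)
        except (socket.gaierror, TypeError):
            continue  # unresolvable host: skip (or treat as unwanted)
        if not ip_in_country(ip):
            yield url
```

The resulting list of unwanted URLs could then be fed back to whatever mechanism removes them from the crawldb.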
I'd appreciate any help.
_______________________________________________
Nutch-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-general
