hi all,
       I ran a crawl of approximately 40,000 URLs. It stopped partway through
with a "no disk space available" error. Is there any way to restrict the size
of segments so that only a few MB go into a particular segment?
thanks in advance.
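
For context, the usual way to keep individual segments small is to generate them in smaller batches rather than all at once. A minimal sketch, assuming a standard Nutch crawl layout with the crawl db at crawl/crawldb and segments under crawl/segments (paths are illustrative, not from the original post):

```
# Cap the number of top-scoring URLs placed into the new segment with -topN,
# which indirectly bounds the segment's size on disk.
bin/nutch generate crawl/crawldb crawl/segments -topN 1000

# Then fetch and updatedb as usual, and repeat the generate/fetch cycle
# in smaller batches instead of building one 40,000-URL segment.
```

The exact size in MB still depends on page sizes, so -topN is a rough lever rather than a hard byte limit.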
-- 
View this message in context: 
http://www.nabble.com/how-to-restrict-the-size-of-segments-tf3394948.html#a9451340
Sent from the Nutch - User mailing list archive at Nabble.com.


_______________________________________________
Nutch-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-general
