Yes, if you run a pseudo-distributed cluster on that machine and configure enough map/reduce task capacity.
On Sun, Nov 14, 2010 at 12:20 AM, Jitendra <[email protected]> wrote:
>
> Hi,
>
> I wanted to add one more question on this: can Nutch run multiple jobs in
> parallel on the same machine? I have changed Nutch to use a different
> crawldb and url directory for each job.
>
> Thanks
>
> On Fri, Nov 12, 2010 at 5:22 PM, Birger Lie [via Lucene]
> <[email protected]> wrote:
>>
>> It can be distributed on several machines (as many as you like), and it
>> does support robots.txt.
>>
>> - Birger
>>
>> On Nov 12, 2010, at 12:34 PM, mohammad amin golshani wrote:
>>
>>> Does Nutch have the ability to run on multiple machines?
>
> --
> Thanks and regards
>
> Jitendra Singh
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/can-nutch-s-crawler-run-parallel-tp1888331p1895173.html
> Sent from the Nutch - User mailing list archive at Nabble.com.
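Jitendra's setup above (a separate crawldb and url directory per job) can be sketched as two background crawls. This assumes the Nutch 1.x one-shot `bin/nutch crawl` command; the directory names and depth/topN values are hypothetical:

```shell
# Sketch: two independent crawl jobs on one machine, each with its own
# seed-url directory (urls-job1, urls-job2) and crawl output directory
# (crawl-job1, crawl-job2), so their crawldbs never collide.
# Directory names are hypothetical; adjust to your installation.
for job in job1 job2; do
  bin/nutch crawl "urls-$job" -dir "crawl-$job" -depth 3 -topN 1000 \
    > "crawl-$job.log" 2>&1 &   # run in the background, log per job
done
wait  # block until both background crawls have finished
```

With enough map/reduce task slots configured, the two jobs' tasks can run concurrently; otherwise the second job's tasks queue behind the first.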

