Hi all

I want to run two crawlers on a single server at the same time, each
with a different seed list.

The question is:

Is it safe to use a single set of binaries? I have written scripts that
specify different input/output locations, but I wonder whether Nutch
creates temporary folders during its work that I cannot control, so that
the two crawlers might end up overwriting each other's working data.
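For what it's worth, here is a rough sketch of how two instances could be kept apart. The seed/output paths, the `run_crawl` helper, and the use of `hadoop.tmp.dir` to separate intermediate job files are all assumptions to verify against your Nutch/Hadoop versions; the script only echoes the commands rather than executing them:

```shell
# Hypothetical sketch: isolate two Nutch crawl instances on one server.
# Paths and the hadoop.tmp.dir override are assumptions, not verified
# against any particular Nutch release.

CRAWL_A=/data/crawl-a   # output directory for crawler A
CRAWL_B=/data/crawl-b   # output directory for crawler B

# Give each instance its own temp dir so intermediate job files written
# by Hadoop in local mode do not collide between the two crawls.
run_crawl() {
  seeds=$1; out=$2; tmp=$3
  # echo instead of exec: shows the command line that would be run
  echo bin/nutch crawl "$seeds" -dir "$out" -Dhadoop.tmp.dir="$tmp"
}

run_crawl /data/seeds-a "$CRAWL_A" /tmp/nutch-a
run_crawl /data/seeds-b "$CRAWL_B" /tmp/nutch-b
```

Whether `-D` properties are honored by the `crawl` command depends on the Nutch version; another option is to keep two separate Nutch installation directories so each instance has its own conf and runtime area.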

Thanks
Alexander Aristov
