Hi all,

I'm crawling several sites in parallel. With the default Nutch conf, all the crawls write to the same hadoop.log, which causes problems. So I created a separate nutch_conf_dir per site and set a different hadoop.log.dir per site in log4j.properties. The problem is that even when I run a crawl with the new NUTCH_CONF_DIR exported, Nutch still writes to the log from the default conf.

Thanks in advance!

--
View this message in context: http://www.nabble.com/hadoop.log-in-parallel-crawling-tp23811444p23811444.html
Sent from the Nutch - User mailing list archive at Nabble.com.
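For reference, a minimal sketch of the per-site setup I mean (the site name, directory paths, and the `DRFA` appender key are assumptions based on a stock Nutch conf/log4j.properties; adjust to your install):

```shell
# Hypothetical per-site conf/log layout -- names are illustrative, not from my actual setup
SITE=siteA
CONF_DIR="$HOME/nutch_conf_$SITE"
LOG_DIR="$HOME/nutch_logs/$SITE"

mkdir -p "$CONF_DIR" "$LOG_DIR"
# assumes NUTCH_HOME points at the Nutch install
cp -r "$NUTCH_HOME/conf/." "$CONF_DIR"

# Point this site's log4j.properties at its own hadoop.log
# (stock conf uses log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file})
sed -i "s|^log4j.appender.DRFA.File=.*|log4j.appender.DRFA.File=$LOG_DIR/hadoop.log|" \
    "$CONF_DIR/log4j.properties"

# bin/nutch reads NUTCH_CONF_DIR; passing -Dhadoop.log.dir via NUTCH_OPTS
# should override the log directory at the JVM level as well
export NUTCH_CONF_DIR="$CONF_DIR"
export NUTCH_OPTS="-Dhadoop.log.dir=$LOG_DIR"
# bin/nutch crawl urls -dir "crawl_$SITE" ...
```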
