Hi folks,
I've set up Nutch 1.12 and Solr 6.1. I'm using the crawl script as described in the tutorial, but the last command in the script, i.e. nutch clean, is throwing the following exception:

    Cleaning up index if possible
    /opt/nutch-latest/bin/nutch clean -Dsolr.server.url=http://192.168.99.100:8983/solr/test/ crawl/crawldb
    SolrIndexer: deleting 2/2 documents
    ERROR CleaningJob: java.io.IOException: Job failed!
        at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:172)
        at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:195)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:206)
    Error running:
      /opt/nutch-latest/bin/nutch clean -Dsolr.server.url=http://192.168.99.100:8983/solr/test/ crawl/crawldb
    Failed with exit value 255.

This is the command I ran:

    [root@2a563cff0511 nutch-latest]# bin/crawl -i \
    > -D solr.server.url=http://192.168.99.100:8983/solr/bt-business/ urls/ crawl1 1

Nutch is installed in the following environment:

* CentOS 6.8
* Java x64 1.7.0_79
* Nutch 1.12
* Solr 6.1 (installed in a separate Docker instance)

All the earlier commands in the script executed successfully, and I'm able to perform queries in Solr. Any help is highly appreciated.

Regards,
Munim
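In case it helps whoever looks at this: the console only shows the IOException wrapper, and the underlying cause (often a SolrException) usually ends up in logs/hadoop.log. To surface it on the console I tried raising the log level for the cleaning job and the Solr index writer in conf/log4j.properties (a sketch; the cmdstdout appender name is from the stock Nutch 1.12 log4j config, and I'm assuming the indexer-solr plugin's package name org.apache.nutch.indexwriter.solr):

    # Log the CleaningJob and the Solr index writer plugin at DEBUG to the console
    log4j.logger.org.apache.nutch.indexer.CleaningJob=DEBUG,cmdstdout
    log4j.logger.org.apache.nutch.indexwriter.solr=DEBUG,cmdstdout

With that in place, re-running bin/nutch clean should print the root-cause exception instead of just "Job failed!".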