I found the problem: the host name was not set up in DNS yet, and
Hadoop threw an exception when it failed to resolve the local host.
Once I added the host name to the hosts file, it worked.
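For reference, the fix amounts to adding an entry like the one below to /etc/hosts. The host name and address are placeholders; substitute the name your machine reports from `hostname` and its actual IP:

```shell
# /etc/hosts
# "myhost" / 192.168.1.10 are placeholders for the real
# host name and address of the new machine.
127.0.0.1      localhost
192.168.1.10   myhost.example.com   myhost
```

You can check that the name resolves with `getent hosts "$(hostname)"` before rerunning the inject.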

On Tue, 2007-07-24 at 21:54 -0400, kevin chen wrote:
> I just switched to a new host, and the inject failed. What could be
> the problem?
> I am using the nutch-0.9 release, and everything worked fine on
> another host.
> 
> bin/nutch inject crawl/crawldb urls/
> 
> Injector: starting
> Injector: crawlDb: crawl/crawldb
> Injector: urlDir: urls
> Injector: Converting injected urls to crawl db entries.
> Injector: java.io.IOException: Job failed!
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
>         at org.apache.nutch.crawl.Injector.inject(Injector.java:162)
>         at org.apache.nutch.crawl.Injector.run(Injector.java:192)
>         at org.apache.hadoop.util.ToolBase.doMain(ToolBase.java:189)
>         at org.apache.nutch.crawl.Injector.main(Injector.java:182)
> 
> 


_______________________________________________
Nutch-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-general
