Does /user/root/urls exist? Have you uploaded the urls folder to your DFS system?

bin/hadoop dfs -mkdir urls
bin/hadoop dfs -copyFromLocal urls.txt urls/urls.txt

or

bin/hadoop dfs -put <localsrc> <dst>
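
You can verify the upload with a dfs listing (the /user/root/urls path below assumes the job runs as root; adjust it to your own DFS home directory):

bin/hadoop dfs -ls /user/root/urls

If urls.txt shows up in the listing, the Injector should be able to find it.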


Mohan Lal wrote:
> Hi all,
>
> While I am trying to crawl using distributed machines it throws an error:
>
> bin/nutch crawl urls -dir crawl -depth 10 -topN 50
> crawl started in: crawl
> rootUrlDir = urls
> threads = 10
> depth = 10
> topN = 50
> Injector: starting
> Injector: crawlDb: crawl/crawldb
> Injector: urlDir: urls
> Injector: Converting injected urls to crawl db entries.
> Exception in thread "main" java.io.IOException: Input directory
> /user/root/urls in localhost:9000 is invalid.
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:274)
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:327)
>         at org.apache.nutch.crawl.Injector.inject(Injector.java:138)
>         at org.apache.nutch.crawl.Crawl.main(Crawl.java:105)
>
> What's wrong with my configuration? Please help me.
>
>
> Regards
> Mohan Lal 
>   

