Hi all,
I am running nutch-0.9 with hadoop-0.9.1.
While starting the namenode I get an exception, and after that the namenode
does not respond to any client requests.

When I searched the forum in this regard, I found a patch on
HADOOP-745 <http://issues.apache.org/jira/browse/HADOOP-745>.
I applied the patch successfully, compiled the 0.9.1 code, and then replaced
hadoop-0.9.1.jar with the resulting hadoop-0.9.2-dev.jar.
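For clarity, the rebuild-and-swap steps I followed look roughly like this (the patch filename and source-tree path here are illustrative, not the exact names from the issue):

```shell
# Rough sketch of the patch/rebuild steps; HADOOP-745.patch and the
# directory name are illustrative, not taken from the issue itself.
HADOOP_SRC=${HADOOP_SRC:-hadoop-0.9.1}   # unpacked 0.9.1 source tree (assumed path)
if [ -d "$HADOOP_SRC" ]; then
    cd "$HADOOP_SRC" &&
    patch -p0 < HADOOP-745.patch &&      # apply the fix attached to the JIRA issue
    ant jar                              # rebuild; produces build/hadoop-0.9.2-dev.jar
else
    echo "source tree $HADOOP_SRC not found; set HADOOP_SRC first"
fi
```

After the build, I copied the new jar over the old hadoop-0.9.1.jar in Nutch's lib directory.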

I then tried to start the DFS again by calling start-dfs, but I still get
the same error on the same filesystem. I don't want to switch to a new
filesystem, as the current one already contains some data.

Does anybody have any ideas in this regard?

Thanks and regards,
~Shailendra
_______________________________________________
Nutch-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-general
