Please check your classpath entries.
It looks like the hadoop-core jar in use before you shut down the cluster and the one picked up after you changed hadoop-env.sh are different.
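One way to confirm is to compare the build revision reported on each node. The commands below are only a sketch, assuming a standard tarball install with HADOOP_HOME pointing at it on both the namenode and the datanodes:

        # The line reporting the Subversion revision ("-r <revision>") is the
        # build version (BV) that the datanode handshake compares against the
        # namenode's.
        $HADOOP_HOME/bin/hadoop version

        # Also check that only one hadoop-*-core.jar ends up on the classpath
        # (the exact jar name depends on your release).
        ls $HADOOP_HOME/hadoop-*-core.jar

If the revisions differ between the namenode and a datanode, that mismatch would explain the "Incompatible build versions" error below.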

-Sagar

Songting Chen wrote:
Hi,
I modified the classpath in hadoop-env.sh on the namenode and the datanodes before shutting down the cluster. Then the problem appeared: I could not stop the Hadoop cluster at all. stop-all.sh reported no datanode/namenode, while all the Java processes were still running, so I manually killed the Java processes. Now the namenode seems to be corrupted and always stays in safe mode, while the datanodes complain with the following strange error:

2008-10-27 17:28:44,141 FATAL org.apache.hadoop.dfs.DataNode: Incompatible build versions: namenode BV = ; datanode BV = 694836
2008-10-27 17:28:44,244 ERROR org.apache.hadoop.dfs.DataNode: java.io.IOException: Incompatible build versions: namenode BV = ; datanode BV = 694836
        at org.apache.hadoop.dfs.DataNode.handshake(DataNode.java:403)
        at org.apache.hadoop.dfs.DataNode.startDataNode(DataNode.java:250)
        at org.apache.hadoop.dfs.DataNode.<init>(DataNode.java:190)
        at org.apache.hadoop.dfs.DataNode.makeInstance(DataNode.java:2987)
        at org.apache.hadoop.dfs.DataNode.instantiateDataNode(DataNode.java:2942)
        at org.apache.hadoop.dfs.DataNode.createDataNode(DataNode.java:2950)
        at org.apache.hadoop.dfs.DataNode.main(DataNode.java:3072)

  My question is how to recover from such a failure. I also assume the correct practice for changing the CLASSPATH is to shut down the cluster, apply the change, and then restart the cluster; an illustrative sketch of that sequence follows.
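For illustration only, here is a sketch of that sequence, assuming the standard scripts shipped under $HADOOP_HOME/bin (dfsadmin -safemode leave forces the namenode out of safe mode once the datanodes can register again):

        # Proper sequence for a CLASSPATH change:
        bin/stop-all.sh        # stop the cluster with the old hadoop-env.sh still in place
        # ...edit conf/hadoop-env.sh on the namenode and every datanode...
        bin/start-all.sh       # restart so every daemon picks up the same jars

        # If the namenode is still stuck in safe mode after recovery:
        bin/hadoop dfsadmin -safemode leave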

Thanks,
-Songting
