Oh, I'm sorry: my symlink was named Hadoop 0.20.2, but it actually pointed to the append version (0.20-append-r1056497, r1056491), which should be compatible with HBase. That was my mistake. I will test again with the real 0.20.2.
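Since the confusion here came from a symlink whose name did not match its target, a quick sanity check is to resolve the link before trusting the directory name. Below is a minimal Python sketch; the directory names are hypothetical examples mirroring the situation in this thread, not paths taken from anyone's actual machine.

```python
# Sketch: verify what a "hadoop-0.20.2" symlink really points to, so a
# directory that *looks* like stock 0.20.2 can't silently be the append build.
# All paths here are throwaway examples created in a temp directory.
import os
import tempfile


def resolve_install(path):
    """Return the fully resolved target of a (possibly symlinked) install dir."""
    return os.path.realpath(path)


if __name__ == "__main__":
    # Demo layout: hadoop-0.20.2 -> hadoop-0.20-append-r1056497
    base = tempfile.mkdtemp()
    real = os.path.join(base, "hadoop-0.20-append-r1056497")
    os.mkdir(real)
    link = os.path.join(base, "hadoop-0.20.2")
    os.symlink(real, link)
    # The resolved path ends with the append directory, not 0.20.2
    print(resolve_install(link))
```

Running `resolve_install` on the symlink prints the append directory, which is exactly the kind of surprise described above.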
2011/9/13 Thomas Jungblut <[email protected]>

> Hi zhaoguo,
>
> Officially we only support Hadoop 0.20.2 for Hama 0.3.0-incubating.
> Every appending release of Hadoop, e.g. 0.20.203 or 0.21.0, has changed
> its RPC protocol. This is the reason for the version mismatch in your
> header.
> I'm pretty sure we build against 0.20.2, although I get the same errors:
>
>> org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol
>> org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch.
>> (client = 41, server = 42)
>>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
>>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>
> And in my lib directory lies a "hadoop-core-0.20.2.jar". Strange.
>
> Thanks for your observation. I'll take a deeper look into it and provide
> you with a fix if I have one.
>
>
> 2011/9/13 zhaoguo wang <[email protected]>
>
>> Hello everyone,
>>
>> I'm new to both Hama and Hadoop. When I tried to set up Hama with HDFS
>> on the same single node, I ran into the following problems. First I
>> tried hadoop-0.20.203.0 with hama-0.3.0-incubating: when Hama's BSP
>> started up and tried to communicate with the HDFS namenode, the
>> connection was refused on Hama's side. The HDFS log contains the
>> following information:
>>
>> Incorrect header or version mismatch from 127.0.0.1:52772 got version 3
>> expected version 4.
>>
>> To avoid the mismatch problem, I also tried hadoop-0.20.2 and
>> hadoop-0.21.0. The same problem happened again when I used Hama with
>> hadoop-0.20.2.
>>
>> With hadoop-0.21.0 the problem changed. The following exception is
>> thrown in the HDFS log instead of the mismatch:
>>
>> 2011-09-13 14:37:24,264 INFO org.apache.hadoop.ipc.Server: IPC Server
>> listener on 9000: readAndProcess threw exception java.io.IOException:
>> Unable to read authentication method. Count of bytes read: 0
>> java.io.IOException: Unable to read authentication method
>>     at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1079)
>>     at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:525)
>>     at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:332)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
>>     at java.lang.Thread.run(Thread.java:636)
>>
>> I have no idea what happened. Can anyone give me some suggestions?
>>
>> --
>> Zhaoguo Wang, Parallel Processing Institute, Fudan University
>> Address: Room 320, Software Building, 825 Zhangheng Road, Shanghai, China
>> [email protected]
>> http://ppi.fudan.edu.cn/zhaoguo_wang
>
> --
> Thomas Jungblut
> Berlin
>
> mobile: 0170-3081070
>
> business: [email protected]
> private: [email protected]

--
Thomas Jungblut
Berlin

mobile: 0170-3081070

business: [email protected]
private: [email protected]
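For readers following along: the "got version 3 expected version 4" and "client = 41, server = 42" errors in this thread both come from a version number compared during connection setup, with the connection rejected on any mismatch. The sketch below is a conceptual illustration of that check in Python, not Hadoop's actual wire format or API; the names `accept_client` and `VersionMismatch` are invented for the example.

```python
# Conceptual sketch of a strict RPC version handshake: the server compares
# the version the client advertises against its own and refuses the
# connection on any mismatch, producing messages shaped like the ones in
# the Hadoop logs above. (Not Hadoop's real protocol; names are invented.)
SERVER_VERSION = 4


class VersionMismatch(Exception):
    """Raised when client and server disagree on the protocol version."""


def accept_client(client_version, server_version=SERVER_VERSION):
    """Accept a client connection only if the protocol versions match exactly."""
    if client_version != server_version:
        raise VersionMismatch(
            "got version %d expected version %d" % (client_version, server_version)
        )
    return "connection accepted"
```

The point of the strict equality check is that even a "small" Hadoop point release (0.20.2 vs. 0.20.203) bumps the version number, so a client built against one release cannot talk to a server running another.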
