I tried this by recompiling once again, but I end up with the same problem. Not sure what I am missing. Any hints or clues?
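In case it helps, the build invocation I am using is roughly the following (assuming -Dhadoop.profile=2.0 is still the flag that selects the hadoop 2 profile on trunk):

    mvn clean install -DskipTests -Dhadoop.profile=2.0

After the build I also check that the hadoop jars bundled under lib/ match the cluster version, e.g. with ls lib/hadoop-*.jar.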
Regards
Ram

On Fri, Feb 22, 2013 at 7:56 AM, ramkrishna vasudevan <[email protected]> wrote:

> Yes, Ted. I have two different setups with me: one compiled with the
> hadoop 2.0 profile and the other with the 1.0 profile.
>
> Regards
> Ram
>
>
> On Thu, Feb 21, 2013 at 10:30 PM, Ted Yu <[email protected]> wrote:
>
>> bq. But when I tried with hadoop-1.0.4 it worked fine.
>>
>> Just a guess: did you recompile the code with the hadoop 2.0 profile
>> before trying?
>> The fact that initialization failed with a PB exception led to the above
>> question.
>>
>> Cheers
>>
>> On Thu, Feb 21, 2013 at 7:24 AM, ramkrishna vasudevan
>> <[email protected]> wrote:
>>
>> > Hi Devs
>> >
>> > I tried to run the current HBase trunk snapshot with Hadoop 2.0.3-alpha
>> > and got the following exception:
>> >
>> > java.io.IOException: Failed on local exception:
>> > com.google.protobuf.InvalidProtocolBufferException: Message missing
>> > required fields: callId, status; Host Details : local host is:
>> > "ram/10.239.47.144"; destination host is: "localhost":9000;
>> >   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
>> >   at org.apache.hadoop.ipc.Client.call(Client.java:1168)
>> >   at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>> >   at $Proxy10.setSafeMode(Unknown Source)
>> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >   at java.lang.reflect.Method.invoke(Method.java:597)
>> >   at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>> >   at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>> >   at $Proxy10.setSafeMode(Unknown Source)
>> >   at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:514)
>> >   at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:1896)
>> >   at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:660)
>> >   at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:261)
>> >   at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:650)
>> >   at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:389)
>> >   at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:147)
>> >   at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:131)
>> >   at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:654)
>> >   at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:476)
>> >   at java.lang.Thread.run(Thread.java:662)
>> > Caused by: com.google.protobuf.InvalidProtocolBufferException: Message
>> > missing required fields: callId, status
>> >   at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
>> >   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
>> >   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
>> >   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
>> >   at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:886)
>> >   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:817)
>> > 2013-02-20 20:44:01,928 INFO org.apache.hadoop.hbase.master.HMaster:
>> > Aborting
>> >
>> > I checked whether something similar had already been raised on the dev
>> > list, but could not find anything. When I tried with hadoop-1.0.4 it
>> > worked fine.
>> > Did anyone face this problem?
>> >
>> > Regards
>> > Ram
