>
> Hi,
> I recently upgraded from Cloudera's CDH3 to CDH4 (Hadoop 0.23), and
> I'm trying to check whether a file exists on HDFS using the following
> code (which was working fine on CDH3):
>
>     Path path = new Path(p);
>     if (!fileSystem.exists(path)) {
>         fileSystem.mkdirs(path);
>     }
>
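> For context, here is a minimal standalone sketch of the same check; the
> class name and the fs.defaultFS value below are only placeholders for my
> setup:
>
>     import org.apache.hadoop.conf.Configuration;
>     import org.apache.hadoop.fs.FileSystem;
>     import org.apache.hadoop.fs.Path;
>
>     public class HdfsExistsCheck {
>         public static void main(String[] args) throws Exception {
>             // Point the client at the NameNode (placeholder address)
>             Configuration conf = new Configuration();
>             conf.set("fs.defaultFS", "hdfs://cloud2.com:8020");
>
>             FileSystem fileSystem = FileSystem.get(conf);
>             Path path = new Path(args[0]);
>
>             // Create the directory only if it does not already exist
>             if (!fileSystem.exists(path)) {
>                 fileSystem.mkdirs(path);
>             }
>         }
>     }
>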
> I'm getting the following exception:
>
> java.io.EOFException
>     at java.io.DataInputStream.readInt(DataInputStream.java:392)
>     at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:848)
>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:780)
> 18:04:15,103 DEBUG Client:923 - IPC Client (900594179) connection to /192.168.3.86:8020 from admin: closed
> 18:04:15,103 DEBUG Client:793 - IPC Client (900594179) connection to /192.168.3.86:8020 from admin: stopped, remaining connections 0
> java.io.IOException: Failed on local exception: java.io.EOFException; Host Details : local host is: "PC/192.168.3.58"; destination host is: ""cloud2.com":8020;
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:758)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1163)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:188)
>     at $Proxy9.getFileInfo(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:601)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:165)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:84)
>     at $Proxy9.getFileInfo(Unknown Source)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:622)
>     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1344)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:718)
>     at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1165)
>     at com.cloudmatix.bl.WorkflowServiceManager.createAppDir(WorkflowServiceManager.java:210)
>     at com.cloudmatix.bl.WorkflowServiceManager.create(WorkflowServiceManager.java:101)
>     at com.cloudmatix.bl.WorkflowServiceManager.main(WorkflowServiceManager.java:167)
> Caused by: java.io.EOFException
>     at java.io.DataInputStream.readInt(DataInputStream.java:392)
>     at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:848)
>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:780)
>
>
> and the NameNode log shows the following:
>
> IPC Server listener on 8020: readAndProcess threw exception com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero). from client 192.168.3.58. Count of bytes read: 0
> com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).
>       at com.google.protobuf.InvalidProtocolBufferException.invalidTag(InvalidProtocolBufferException.java:68)
>       at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:108)
>       at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcPayloadHeaderProto$Builder.mergeFrom(RpcPayloadHeaderProtos.java:628)
>       at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcPayloadHeaderProto$Builder.mergeFrom(RpcPayloadHeaderProtos.java:495)
>       at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:212)
>       at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>       at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>       at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>       at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>       at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>       at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>       at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcPayloadHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:452)
>       at org.apache.hadoop.ipc.Server$Connection.processData(Server.java:1556)
>       at org.apache.hadoop.ipc.Server$Connection.processOneRpc(Server.java:1541)
>       at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1395)
>       at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:710)
>       at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:509)
>       at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:484)
>
> Can anybody help me out?
> Thank you.
>
>


-- 
Regards
Akhtar Muhammad Din