[ https://issues.apache.org/jira/browse/HADOOP-7047?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hairong Kuang resolved HADOOP-7047.
-----------------------------------
    Resolution: Invalid

I just realized that this problem has already been fixed in trunk. The fix is in Connection#run. Good job, Hadoop community!

> RPC client gets stuck
> ---------------------
>
>                 Key: HADOOP-7047
>                 URL: https://issues.apache.org/jira/browse/HADOOP-7047
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: ipc
>            Reporter: Hairong Kuang
>            Assignee: Hairong Kuang
>             Fix For: 0.22.0
>
>         Attachments: trunkStuckClient.patch
>
>
> One of the dfs clients in our cluster got stuck waiting for an RPC result. However, the IPC connection thread that was receiving the RPC result had died on an OOM error:
> INFO >> Exception in thread "IPC Client (47) connection to XX from root" java.lang.OutOfMemoryError: Java heap space
> INFO >> at java.util.Arrays.copyOfRange(Arrays.java:3209)
> INFO >> at java.lang.String.<init>(String.java:216)
> INFO >> at java.lang.StringBuffer.toString(StringBuffer.java:585)
> INFO >> at java.net.URI.toString(URI.java:1907)
> INFO >> at java.net.URI.<init>(URI.java:732)
> INFO >> at org.apache.hadoop.fs.Path.initialize(Path.java:137)
> INFO >> at org.apache.hadoop.fs.Path.<init>(Path.java:126)
> INFO >> at org.apache.hadoop.fs.FileStatus.readFields(FileStatus.java:206)
> INFO >> at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
> INFO >> at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:171)
> INFO >> at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:219)
> INFO >> at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
> INFO >> at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:531)
> INFO >> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:466)

--
This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
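For context, below is a minimal, self-contained sketch (not Hadoop's actual Client.java; the class and method names are hypothetical) of the kind of safeguard the resolution refers to in Connection#run: if the connection's receiver thread dies on any Throwable, including an OutOfMemoryError like the one in the stack trace above, it must fail every pending call so that callers blocked waiting for a result are woken up instead of hanging forever. The key point is catching Throwable rather than only IOException, since an Error would otherwise bypass the cleanup path.

    // Hypothetical sketch, assuming a receiver-thread/pending-call design
    // similar in spirit to org.apache.hadoop.ipc.Client$Connection.
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    public class StuckClientSketch {

        /** A pending RPC call that a caller thread waits on. */
        static class Call {
            final int id;
            Object value;          // result, if the call succeeded
            IOException error;     // error, if the call failed
            boolean done;

            Call(int id) { this.id = id; }

            synchronized void setError(IOException e) { error = e; done = true; notifyAll(); }

            synchronized Object waitForResult() throws IOException, InterruptedException {
                while (!done) {
                    wait();        // without receiver-side cleanup, this can wait forever
                }
                if (error != null) throw error;
                return value;
            }
        }

        /** The receiver thread; its run() is analogous in spirit to Connection#run. */
        static class Receiver extends Thread {
            private final Map<Integer, Call> pendingCalls = new HashMap<>();
            private volatile boolean closed = false;

            synchronized void addCall(Call call) { pendingCalls.put(call.id, call); }

            @Override
            public void run() {
                try {
                    while (!closed) {
                        receiveOneResponse();   // may throw, including OutOfMemoryError
                    }
                } catch (Throwable t) {
                    // Catch Throwable, not just IOException: an OutOfMemoryError here
                    // must still lead to waking up waiting callers.
                    closed = true;
                } finally {
                    cleanupCalls();
                }
            }

            /** Placeholder for reading one RPC response off the wire. */
            private void receiveOneResponse() throws IOException {
                throw new OutOfMemoryError("simulated heap exhaustion while decoding a response");
            }

            /** Fail every pending call so no caller stays blocked in wait(). */
            private synchronized void cleanupCalls() {
                IOException cause = new IOException("connection receiver thread died");
                for (Call call : pendingCalls.values()) {
                    call.setError(cause);
                }
                pendingCalls.clear();
            }
        }

        public static void main(String[] args) throws Exception {
            Receiver receiver = new Receiver();
            Call call = new Call(1);
            receiver.addCall(call);
            receiver.start();
            try {
                call.waitForResult();    // throws instead of hanging
            } catch (IOException e) {
                System.out.println("caller unblocked: " + e.getMessage());
            }
        }
    }

Running the sketch prints "caller unblocked: connection receiver thread died" rather than blocking, which is the behavior the attached trunkStuckClient.patch was originally meant to guarantee.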