Hi,

I just faced an issue this morning on one of my RS.

Here is an extract from the logs:
2013-04-09 11:05:33,164 ERROR org.apache.hadoop.hdfs.DFSClient:
Exception closing file
/hbase/entry_proposed/ae4a5d72d4613728ddbcc5a64262371b/.tmp/ed6a0154ef714cd88faf26061cf248d3
: java.net.SocketException: Too many open files
java.net.SocketException: Too many open files
        at sun.nio.ch.Net.socket0(Native Method)
        at sun.nio.ch.Net.socket(Net.java:323)
        at sun.nio.ch.Net.socket(Net.java:316)
        at sun.nio.ch.SocketChannelImpl.<init>(SocketChannelImpl.java:101)
        at sun.nio.ch.SelectorProviderImpl.openSocketChannel(SelectorProviderImpl.java:60)
        at java.nio.channels.SocketChannel.open(SocketChannel.java:142)
        at org.apache.hadoop.net.StandardSocketFactory.createSocket(StandardSocketFactory.java:58)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:3423)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3381)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2589)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2829)

ulimit is unlimited on all my servers.
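For what it's worth, the limit that matters is the one on the running process, which can differ from what `ulimit` reports in a login shell (daemons often start under different limits). A minimal Linux sketch for checking it, using this shell's own pid as a stand-in for the RegionServer pid:

```shell
# Check the open-files limit actually applied to a running process.
# $$ (this shell) is a placeholder; in practice use the RS pid,
# e.g. from `jps` or `pgrep -f HRegionServer`.
pid=$$
grep 'Max open files' /proc/$pid/limits
# How many descriptors the process currently holds:
ls /proc/$pid/fd | wc -l
```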

It seems there were too many open network connections. Is there
anything HBase can do to handle such a scenario? Only Hadoop appears
in the stack trace, so I'm not sure.

Could this be related to nproc? I don't think so. I do have another
tool running on the RS; it uses little CPU and bandwidth, but it makes
MANY HTTP connections...
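To see whether sockets (rather than regular files) are eating the descriptor budget, one rough Linux sketch is to summarize a process's open descriptors by type; again using this shell's pid as a placeholder for the RS or the other tool:

```shell
# Summarize a process's open descriptors by type (socket, pipe, file...).
# Replace $$ with the pid of the RS or the other tool.
pid=$$
for fd in /proc/$pid/fd/*; do
  readlink "$fd"
done | sed 's/:.*$//; s|/.*|file|' | sort | uniq -c | sort -rn
```

A large "socket" count pointing back at the HTTP-heavy tool would support that theory.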

Any suggestion?

JM
