I have found the cause of the problem described in my previous post. It appears to be related to the following Solaris-specific Java bug: http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6322825
The bug report suggests two workarounds. One of them, raising the hard limit on the number of file descriptors, was already in place in my setup, yet the problem still occurred. Setting the suggested JVM system property, however, has resolved it:

  -Djava.nio.channels.spi.SelectorProvider=sun.nio.ch.PollSelectorProvider

Thanks,
Alexandra.
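For anyone applying the same workaround, a quick way to confirm that the JVM actually picked up the property is to print the provider in use at runtime. The small class below is only an illustrative check I would use, not part of Hadoop itself:

  import java.nio.channels.spi.SelectorProvider;

  // Prints the SelectorProvider implementation the JVM selected.
  // When started with
  //   -Djava.nio.channels.spi.SelectorProvider=sun.nio.ch.PollSelectorProvider
  // it should report sun.nio.ch.PollSelectorProvider instead of the
  // default Solaris (/dev/poll based) provider.
  public class SelectorProviderCheck {
      public static void main(String[] args) {
          System.out.println(SelectorProvider.provider().getClass().getName());
      }
  }

In my case I pass the property to the Hadoop daemons through HADOOP_OPTS in conf/hadoop-env.sh, e.g. export HADOOP_OPTS="-Djava.nio.channels.spi.SelectorProvider=sun.nio.ch.PollSelectorProvider"; if you launch the JVM differently, add the -D flag to whatever startup script you use.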