kamatsuoka,

Thanks. I thought I had checked the IOUtils bundled with Spark, but
apparently I missed that it didn't have a *closeQuietly(Closeable)* overload.

We also figured out why this doesn't happen when the task is reattempted.
With debug logging enabled, we saw the following lines printed just before
the closeQuietly exception:

14/02/15 10:51:27 DEBUG DFSClient: Error making BlockReader. Closing stale
NioInetPeer(Socket[addr=/10.0.2.224,port=50010,localport=42418])
java.io.EOFException: Premature EOF: no length prefix available
        at org.apache.hadoop.hdfs.protocol.HdfsProtoUtil.vintPrefixed(HdfsProtoUtil.java:171)
        at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:383)
        at org.apache.hadoop.hdfs.BlockReaderFactory.newBlockReader(BlockReaderFactory.java:136)
        at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:993)

IOUtils.closeQuietly is called in the catch block handling the
EOFException. I'm not quite sure how DFSClient works internally, but it
appears to expect occasional failures while building a block reader, so it
makes multiple attempts, which is why we don't see task reattempts failing.
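For anyone else chasing this: the sketch below shows the failing pattern as
I understand it. The class and method names are illustrative, not the
actual DFSClient code. The underlying issue, as far as I can tell, is that
Commons IO only gained a closeQuietly(Closeable) overload in 2.0; earlier
1.x versions only have overloads for InputStream/OutputStream/Reader/Writer.
Code compiled against 2.x that passes a plain Closeable (like the HDFS peer
socket in the stack trace) then throws NoSuchMethodError at runtime when an
older commons-io jar wins on the classpath.

```java
import java.io.Closeable;
import java.io.EOFException;
import java.io.IOException;

public class CloseQuietlySketch {

    // Sketch of commons-io 2.x's IOUtils.closeQuietly(Closeable).
    // This overload does not exist in commons-io 1.x, which is why a
    // 1.x jar on the classpath produces NoSuchMethodError here.
    static void closeQuietly(Closeable c) {
        if (c == null) {
            return;
        }
        try {
            c.close();
        } catch (IOException ignored) {
            // swallowed, as the name promises
        }
    }

    // Hypothetical stand-in for the peer socket wrapper in the stack
    // trace; only the fact that it is a Closeable (not an InputStream)
    // matters for overload resolution.
    static class Peer implements Closeable {
        boolean closed;

        @Override
        public void close() {
            closed = true;
        }
    }

    // Illustrative version of the retried block-reader setup: on a
    // premature EOF, close the stale peer quietly and rethrow so the
    // caller can retry with a fresh connection.
    static Peer readBlockPrefix(boolean prematureEof) throws EOFException {
        Peer peer = new Peer();
        try {
            if (prematureEof) {
                throw new EOFException("Premature EOF: no length prefix available");
            }
            return peer;
        } catch (EOFException e) {
            closeQuietly(peer); // the call that blew up for us
            throw e;
        }
    }

    public static void main(String[] args) {
        try {
            readBlockPrefix(true);
        } catch (EOFException expected) {
            System.out.println("stale peer closed, caller retries");
        }
    }
}
```

The retry loop above it is what hides the problem from Spark: the
EOFException path is expected and recoverable, so the task only dies when
the closeQuietly call itself fails to link.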

We're moving to spark-0.9.0 as well.

Roshan



On Sat, Feb 15, 2014 at 6:34 AM, kamatsuoka <ken...@gmail.com> wrote:

> I verified that this problem doesn't happen under spark 0.9.0.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-org-apache-commons-io-IOUtils-closeQuietly-with-cdh4-binary-tp204p1541.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
