Hi

This *java.nio.channels.ClosedChannelException* is often caused by a
connection timeout between your Spark executors and Alluxio workers.
One simple and quick fix is to increase the value of
alluxio.user.network.netty.timeout
<https://docs.alluxio.io/os/user/stable/en/reference/Properties-List.html#alluxio.user.network.netty.timeout>
in your Spark jobs.
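
For example, you can pass the property as a JVM system property to both
the driver and the executors when submitting your job. A rough sketch
with spark-submit (the 10min value and the application class/jar names
are just placeholders, so adjust them for your setup):

  spark-submit \
    --class com.example.MyApp \
    --conf 'spark.driver.extraJavaOptions=-Dalluxio.user.network.netty.timeout=10min' \
    --conf 'spark.executor.extraJavaOptions=-Dalluxio.user.network.netty.timeout=10min' \
    my-app.jar

Make sure to set it on both the driver and the executors, since either
end of the connection can hit the timeout.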

Check out how to run Spark with customized Alluxio properties
<https://docs.alluxio.io/os/user/stable/en/compute/Spark.html>.

- Bin


On Thu, May 9, 2019 at 4:39 AM u9g <lwx371...@163.com> wrote:

> Hey,
>
> When I run Spark on Alluxio, I encounter the following error. How can I
> fix this? Thanks
>
> Lost task 63.0 in stage 0.0 (TID 63, 172.28.172.165, executor 7):
> java.io.IOException: java.util.concurrent.ExecutionException:
> java.nio.channels.ClosedChannelException
>
> Best,
> Andy Li
>
