This is a known issue: https://issues.apache.org/jira/browse/SPARK-9844. As
Noorul said, it is probably safe to ignore, since the executor process has
already been destroyed by the time the appender hits the closed stream.
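
For anyone curious why it happens at all: FileAppender is just a background
thread that pumps the executor's stdout/stderr into a file under the work
dir. When the Worker destroys the executor process, the pipe is closed out
from under that thread, and the next read() throws. Here is a minimal
standalone sketch of the same pattern (not Spark's actual code; the object
name, the log path, and the use of `yes` as a stand-in child process are
all made up for illustration):

    import java.io.{FileOutputStream, IOException, InputStream}

    object StreamAppenderSketch {
      // Copy everything from `in` into `path`, the way the worker tails an
      // executor's stdout/stderr into its work directory.
      def appendStreamToFile(in: InputStream, path: String): Unit = {
        val out = new FileOutputStream(path, true) // append mode
        val buf = new Array[Byte](4096)
        try {
          var n = in.read(buf)
          while (n != -1) {
            out.write(buf, 0, n)
            n = in.read(buf) // throws IOException("Stream closed") if the
                             // process is destroyed while we are tailing it
          }
        } finally out.close()
      }

      def main(args: Array[String]): Unit = {
        val proc = new ProcessBuilder("yes").start() // assumes a Unix-like box
        val tail = new Thread(new Runnable {
          def run(): Unit =
            try appendStreamToFile(proc.getInputStream, "/tmp/sketch-stdout.log")
            catch {
              // The benign failure mode from the logs below: the stream was
              // closed because the process is already gone.
              case e: IOException => println("appender stopped: " + e.getMessage)
            }
        })
        tail.start()
        Thread.sleep(100)
        proc.destroy() // closes the pipe underneath the tailing thread
        tail.join()
      }
    }

Depending on timing you will see either a clean EOF or the same "Stream
closed" IOException as in the report below; either way it is harmless,
since the process had already been told to exit.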

On Mon, Dec 21, 2015 at 8:54 PM, Noorul Islam K M <noo...@noorul.com> wrote:

> carlilek <carli...@janelia.hhmi.org> writes:
>
> > My users use Spark 1.5.1 in standalone mode on an HPC cluster, with a
> > smattering still on 1.4.0.
> >
> > I have been getting reports of errors like this:
> >
> > 15/12/21 15:40:33 ERROR FileAppender: Error writing stream to file
> > /scratch/spark/work/app-20151221150645-0000/3/stdout
> > java.io.IOException: Stream closed
> >   at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
> >   at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
> >   at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:107)
> >   at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
> >   at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
> >   at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
> >   at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
> >   at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
> >   at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
> >
> > So far I have been unable to reproduce it reliably, but does anyone
> > have any ideas?
> >
>
> I have seen this happening in our cluster as well. So far I have been
> ignoring it.
>
> Thanks and Regards
> Noorul
>
