[ https://issues.apache.org/jira/browse/SPARK-12876?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15106111#comment-15106111 ]

jeffonia Tung commented on SPARK-12876:
---------------------------------------

I've tested this and it still happens in 1.4.0 — this time on the driver side, 
not during worker shutdown. I've also learned that it was fixed in 1.6.0 by 
https://github.com/apache/spark/pull/10714, so I'm wondering whether this 
problem will likewise be resolved by catching the exception at the 
inputStream.read call in FileAppender.

My mistake — I intended to report the problem and link it to SPARK-4300, so 
that the two can be dealt with together.
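To illustrate the approach discussed above — tolerating the IOException that the redirection thread hits when the process's stream is closed during shutdown — here is a minimal sketch. This is not Spark's actual FileAppender code (see PR 10714 for that); the class and method names here are hypothetical, and the sketch only shows the general pattern of marking the stream as intentionally closed before treating a read failure as benign.

```java
import java.io.*;

// Hypothetical sketch, NOT Spark's FileAppender: copy an input stream to a
// file, treating an IOException that arrives after the source was deliberately
// closed as a normal shutdown rather than an error.
public class TolerantAppender {
    private volatile boolean markedClosed = false;

    /** Call before closing the source stream, so a later read error is expected. */
    public void markClosed() { markedClosed = true; }

    /** Returns true if the copy ended cleanly (EOF, or an expected close). */
    public boolean appendStreamToFile(InputStream in, File target) throws IOException {
        byte[] buf = new byte[8192];
        try (OutputStream out = new FileOutputStream(target, true)) {
            int n;
            while ((n = in.read(buf)) != -1) {  // may throw "Stream closed"
                out.write(buf, 0, n);
            }
            return true;
        } catch (IOException e) {
            if (markedClosed) {
                // Expected: the runner thread closed the stream first.
                return true;
            }
            throw e;  // genuinely unexpected I/O failure
        }
    }
}
```

The key design point is the order of operations on the killing side: set the flag first, then close the stream, so the reader thread can distinguish a shutdown-induced "Stream closed" from a real I/O error.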

> Race condition when driver rapidly shutdown after started.
> ----------------------------------------------------------
>
>                 Key: SPARK-12876
>                 URL: https://issues.apache.org/jira/browse/SPARK-12876
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: jeffonia Tung
>            Priority: Minor
>
> This is much like SPARK-4300; this time, though, it happens occasionally on 
> the driver.
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Asked to launch driver 
> driver-20160118171237-0009
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Copying user jar file:/data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/mylib/spark-ly-streaming-v2-201601141018.jar to /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/driver-20160118171237-0009/spark-ly-streaming-v2-201601141018.jar
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Copying /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/mylib/spark-ly-streaming-v2-201601141018.jar to /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/driver-20160118171237-0009/spark-ly-streaming-v2-201601141018.jar
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Launch Command: 
> "/data/dbcenter/jdk1.7.0_79/bin/java" "-cp" 
> ....."org.apache.spark.deploy.worker.DriverWrapper"......
> [INFO 2016-01-18 17:12:39 (Logging.scala:59)] Asked to launch executor 
> app-20160118171240-0256/15 for DirectKafkaStreamingV2
> [INFO 2016-01-18 17:12:39 (Logging.scala:59)] Launch command: 
> "/data/dbcenter/jdk1.7.0_79/bin/java" "-cp"  
> ....."org.apache.spark.executor.CoarseGrainedExecutorBackend"......
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Asked to kill driver 
> driver-20160118164724-0008
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Redirection to /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/driver-20160118164724-0008/stdout closed: Stream closed
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Asked to kill executor 
> app-20160118164728-0250/15
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Runner thread for executor 
> app-20160118164728-0250/15 interrupted
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Killing process!
> [ERROR 2016-01-18 17:12:49 (Logging.scala:96)] Error writing stream to file /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/app-20160118164728-0250/15/stdout
> java.io.IOException: Stream closed
>         at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>         at java.io.FilterInputStream.read(FilterInputStream.java:107)
>         at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>         at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
>         at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Executor 
> app-20160118164728-0250/15 finished with state KILLED exitStatus 143



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
