[ https://issues.apache.org/jira/browse/SPARK-18297?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15643795#comment-15643795 ]
liuhongqiang edited comment on SPARK-18297 at 11/7/16 10:41 AM:
----------------------------------------------------------------

I use a shared singleton SparkContext in a production environment. The issue is that the SparkContext cannot finish correctly in an asynchronous thread.

was (Author: smallyard): I use a shared singleton SparkContext in a production environment.

> Fail if SparkContext run a new Thread in yarn-cluster
> -----------------------------------------------------
>
>                 Key: SPARK-18297
>                 URL: https://issues.apache.org/jira/browse/SPARK-18297
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>            Reporter: liuhongqiang
>
> program:
>
> public static void main(String[] args) {
>     Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(new Thread(new Runnable() {
>         @Override
>         public void run() {
>             SparkConf conf = new SparkConf();
>             conf.setAppName("SparkDemo");
>             JavaSparkContext sparkContext = new JavaSparkContext(conf);
>             JavaRDD<String> array = sparkContext.parallelize(Lists.newArrayList("1", "2", "3", "4"));
>             System.out.println(array.count());
>         }
>     }), 0, 5000, TimeUnit.MILLISECONDS);
> }
>
> log:
>
> 16/11/02 11:16:47 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
> 16/11/02 11:16:47 INFO yarn.ApplicationMaster: Waiting for spark context initialization
> 16/11/02 11:16:47 INFO yarn.ApplicationMaster: Waiting for spark context initialization ...
> 16/11/02 11:16:47 INFO yarn.ApplicationMaster: Final app status: SUCCEEDED, exitCode: 0
>
> problem:
>
> mainMethod.invoke(null, userArgs.toArray)
> finish(FinalApplicationStatus.SUCCEEDED, ApplicationMaster.EXIT_SUCCESS)
>
> The main method has finished, but a sub-thread may not have finished yet,
> so finish() should not be invoked to shut down the driver thread.
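A possible user-side workaround, given the behavior described above (the ApplicationMaster calls finish() as soon as main() returns, without waiting for background threads): block main() until the background job signals completion, for example with a CountDownLatch. This is only a sketch of the threading pattern in plain Java; the class name KeepDriverAlive and the method runJobAndWait() are hypothetical, and the actual Spark job is omitted and marked by a comment.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class KeepDriverAlive {

    // Runs the background task and blocks until it signals completion,
    // so main() does not return while work is still in flight.
    static String runJobAndWait() throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        StringBuilder result = new StringBuilder();
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.schedule(() -> {
            // ... the actual Spark job (JavaSparkContext work) would run here ...
            result.append("job finished");
            done.countDown(); // signal main() that the work is complete
        }, 0, TimeUnit.MILLISECONDS);
        done.await();          // main thread waits here instead of returning early
        scheduler.shutdown();
        return result.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runJobAndWait());
    }
}
```

With this pattern, by the time main() returns (and the ApplicationMaster invokes finish()), the scheduled work has already completed. Note this only helps for one-shot jobs; it does not address the periodic scheduleAtFixedRate case from the report, which never finishes by design.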
--
This message was sent by Atlassian JIRA (v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org