[jira] [Updated] (SPARK-4514) SparkContext localProperties does not inherit property updates across thread reuse
[ https://issues.apache.org/jira/browse/SPARK-4514?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-4514:
-----------------------------
    Assignee: Richard W. Eggert II  (was: Josh Rosen)

> SparkContext localProperties does not inherit property updates across thread reuse
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-4514
>                 URL: https://issues.apache.org/jira/browse/SPARK-4514
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0, 1.1.1, 1.2.0
>            Reporter: Erik Erlandson
>            Assignee: Richard W. Eggert II
>            Priority: Critical
>             Fix For: 2.0.0
>
> The current job group ID of a Spark context is stored in the
> {{localProperties}} member. This data structure is designed to be
> thread-local, and its settings are not preserved when {{ComplexFutureAction}}
> instantiates a new {{Future}}.
> One consequence of this is that {{takeAsync()}} does not behave in the same
> way as other async actions, e.g. {{countAsync()}}. For example, this test
> (if copied into StatusTrackerSuite.scala) will fail, because
> {{"my-job-group2"}} is not propagated to the Future that actually
> instantiates the job:
> {code:java}
> test("getJobIdsForGroup() with takeAsync()") {
>   sc = new SparkContext("local", "test", new SparkConf(false))
>   sc.setJobGroup("my-job-group2", "description")
>   sc.statusTracker.getJobIdsForGroup("my-job-group2") should be (Seq.empty)
>   val firstJobFuture = sc.parallelize(1 to 1000, 1).takeAsync(1)
>   val firstJobId = eventually(timeout(10 seconds)) {
>     firstJobFuture.jobIds.head
>   }
>   eventually(timeout(10 seconds)) {
>     sc.statusTracker.getJobIdsForGroup("my-job-group2") should be (Seq(firstJobId))
>   }
> }
> {code}
> It also impacts the current PR for SPARK-1021, which involves additional uses
> of {{ComplexFutureAction}}.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
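The thread-reuse failure mode named in the issue title can be reproduced outside Spark. The sketch below is not Spark code: it is a minimal, hypothetical Java demo of how an {{InheritableThreadLocal}} (the mechanism behind {{SparkContext.localProperties}}) snapshots the parent's value only when a child thread is *created*, so a pooled thread that is reused never observes later updates made in the submitting thread:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadReuseDemo {
    // Stand-in for SparkContext.localProperties: a child thread copies the
    // parent's value exactly once, at the moment the child thread is created.
    static final InheritableThreadLocal<String> jobGroup =
        new InheritableThreadLocal<>();

    // Runs two tasks on a single reused pool thread and returns what each saw.
    static String[] run() throws Exception {
        String[] seen = new String[2];
        jobGroup.set("my-job-group1");

        ExecutorService pool = Executors.newFixedThreadPool(1);
        // The pool's worker thread is created lazily by this first submit,
        // so it inherits "my-job-group1" from the calling thread.
        pool.submit(() -> seen[0] = jobGroup.get()).get();

        // Update the property in the parent thread...
        jobGroup.set("my-job-group2");

        // ...but the worker thread is reused, not re-created, so it never
        // sees the update -- the same behavior the issue describes.
        pool.submit(() -> seen[1] = jobGroup.get()).get();
        pool.shutdown();
        return seen;
    }

    public static void main(String[] args) throws Exception {
        String[] seen = run();
        System.out.println("task 1 saw: " + seen[0]); // my-job-group1
        System.out.println("task 2 saw: " + seen[1]); // still my-job-group1
    }
}
```

This is why fixes in this area typically copy the property map explicitly when handing work to another thread, rather than relying on thread-creation-time inheritance.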
[jira] [Updated] (SPARK-4514) SparkContext localProperties does not inherit property updates across thread reuse
[ https://issues.apache.org/jira/browse/SPARK-4514?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Erik Erlandson updated SPARK-4514:
----------------------------------
    Summary: SparkContext localProperties does not inherit property updates across thread reuse  (was: ComplexFutureAction does not preserve job group IDs)

                Key: SPARK-4514
                URL: https://issues.apache.org/jira/browse/SPARK-4514
            Project: Spark
         Issue Type: Bug
         Components: Spark Core
   Affects Versions: 1.1.0, 1.1.1, 1.2.0
           Reporter: Erik Erlandson
           Assignee: Josh Rosen
           Priority: Critical

The current job group ID of a Spark context is stored in the {{localProperties}} member. This data structure is designed to be thread-local, and its settings are not preserved when {{ComplexFutureAction}} instantiates a new {{Future}}.

One consequence of this is that {{takeAsync()}} does not behave in the same way as other async actions, e.g. {{countAsync()}}. For example, this test (if copied into StatusTrackerSuite.scala) will fail, because {{"my-job-group2"}} is not propagated to the Future that actually instantiates the job:

{code:java}
test("getJobIdsForGroup() with takeAsync()") {
  sc = new SparkContext("local", "test", new SparkConf(false))
  sc.setJobGroup("my-job-group2", "description")
  sc.statusTracker.getJobIdsForGroup("my-job-group2") should be (Seq.empty)
  val firstJobFuture = sc.parallelize(1 to 1000, 1).takeAsync(1)
  val firstJobId = eventually(timeout(10 seconds)) {
    firstJobFuture.jobIds.head
  }
  eventually(timeout(10 seconds)) {
    sc.statusTracker.getJobIdsForGroup("my-job-group2") should be (Seq(firstJobId))
  }
}
{code}

It also impacts the current PR for SPARK-1021, which involves additional uses of {{ComplexFutureAction}}.