Repository: spark
Updated Branches:
  refs/heads/branch-2.0 9846a3c4b -> cd870c0c9
[SPARK-20940][CORE] Replace IllegalAccessError with IllegalStateException

## What changes were proposed in this pull request?

`IllegalAccessError` is a fatal error (a subclass of `LinkageError`) whose documented meaning is "Thrown if an application attempts to access or modify a field, or to call a method that it does not have access to." Throwing a fatal error from AccumulatorV2 is unnecessary and harmful, because it usually kills executors or the SparkContext ([SPARK-20666](https://issues.apache.org/jira/browse/SPARK-20666) is an example of the SparkContext being killed by an `IllegalAccessError`). The correct exception type for AccumulatorV2 is `IllegalStateException`.

## How was this patch tested?

Jenkins

Author: Shixiong Zhu <shixi...@databricks.com>

Closes #18168 from zsxwing/SPARK-20940.

(cherry picked from commit 24db35826a81960f08e3eb68556b0f51781144e1)
Signed-off-by: Shixiong Zhu <shixi...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/cd870c0c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/cd870c0c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/cd870c0c

Branch: refs/heads/branch-2.0
Commit: cd870c0c90d96c01b22bd43e3eed2a12df75e20a
Parents: 9846a3c
Author: Shixiong Zhu <shixi...@databricks.com>
Authored: Wed May 31 17:26:18 2017 -0700
Committer: Shixiong Zhu <shixi...@databricks.com>
Committed: Wed May 31 17:26:49 2017 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala | 4 ++--
 core/src/test/scala/org/apache/spark/AccumulatorSuite.scala   | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/cd870c0c/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala b/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala
index d3ddd39..d06ab3d 100644
--- a/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala
+++ b/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala
@@ -67,7 +67,7 @@ abstract class AccumulatorV2[IN, OUT] extends Serializable {
 
   private def assertMetadataNotNull(): Unit = {
     if (metadata == null) {
-      throw new IllegalAccessError("The metadata of this accumulator has not been assigned yet.")
+      throw new IllegalStateException("The metadata of this accumulator has not been assigned yet.")
     }
   }
 
@@ -249,7 +249,7 @@ private[spark] object AccumulatorContext {
       // Since we are storing weak references, we must check whether the underlying data is valid.
       val acc = ref.get
       if (acc eq null) {
-        throw new IllegalAccessError(s"Attempted to access garbage collected accumulator $id")
+        throw new IllegalStateException(s"Attempted to access garbage collected accumulator $id")
       }
       acc
     }

http://git-wip-us.apache.org/repos/asf/spark/blob/cd870c0c/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala
----------------------------------------------------------------------
diff --git a/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala b/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala
index 6cbd5ae..1947f7f 100644
--- a/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala
+++ b/core/src/test/scala/org/apache/spark/AccumulatorSuite.scala
@@ -208,7 +208,7 @@ class AccumulatorSuite extends SparkFunSuite with Matchers with LocalSparkContex
     assert(ref.get.isEmpty)
 
     // Getting a garbage collected accum should throw error
-    intercept[IllegalAccessError] {
+    intercept[IllegalStateException] {
       AccumulatorContext.get(accId)
     }
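The distinction the patch relies on can be sketched in plain Java (a minimal illustration, not Spark code; the `handle` helper and its messages are hypothetical): `IllegalAccessError` descends from `LinkageError`, which is an `Error`, so a typical `catch (Exception e)` guard, like the ones protecting Spark's executor and driver loops, lets it propagate and tear the process down. `IllegalStateException` is a `RuntimeException` and is caught normally.

```java
public class ErrorVsException {
    // Hypothetical handler, standing in for a "catch Exception and recover"
    // guard such as those around Spark task execution.
    static String handle(Runnable r) {
        try {
            r.run();
            return "ok";
        } catch (Exception e) {   // catches IllegalStateException...
            return "recovered: " + e.getClass().getSimpleName();
        }                         // ...but NOT IllegalAccessError (an Error)
    }

    public static void main(String[] args) {
        // The RuntimeException path: recovered inside handle().
        System.out.println(handle(() -> {
            throw new IllegalStateException("accumulator metadata not assigned");
        }));

        // The Error path: escapes handle() entirely and must be caught
        // (or crash the thread) further up the stack.
        try {
            handle(() -> { throw new IllegalAccessError("simulated"); });
        } catch (Error fatal) {
            System.out.println("fatal error escaped: " + fatal.getMessage());
        }
    }
}
```

This is why swapping the exception type matters even though the error message is unchanged: the new exception is recoverable by ordinary exception handling instead of being treated as fatal.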