Repository: spark
Updated Branches:
  refs/heads/master ccba622e3 -> 54e61df26


[SPARK-16599][CORE] java.util.NoSuchElementException: None.get at 
org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask

## What changes were proposed in this pull request?

Avoid a `None.get` exception in the (rare?) case that no read locks exist for a 
task. Note that while this resolves the immediate cause of the exception, it is 
not clear it addresses the root problem.
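The diff below replaces an unconditional `Option.get` with `getOrElse` and an empty default. A minimal sketch of that pattern, using a plain `mutable.HashMap` and `Seq` in place of the actual `BlockInfoManager` state and Guava's `ConcurrentHashMultiset` (stand-ins chosen for brevity, not the real types):

```scala
import scala.collection.mutable

object NoneGetSketch {
  def main(args: Array[String]): Unit = {
    // Stand-in for readLocksByTask; keys are task attempt IDs.
    val readLocksByTask = mutable.HashMap[Long, Seq[String]](1L -> Seq("rdd_0_0"))

    // Before the fix: .get on the Option returned by remove throws
    // java.util.NoSuchElementException: None.get when the key is absent.
    // readLocksByTask.remove(2L).get  // would throw for an unknown task ID

    // After the fix: fall back to an empty collection instead of throwing.
    val readLocks = readLocksByTask.remove(2L).getOrElse(Seq.empty)
    println(readLocks.size)  // 0: no locks to release, no exception
  }
}
```

In the real patch the default is `ImmutableMultiset.of[BlockId]()`, which iterates as empty, so the subsequent lock-release loop simply does nothing for a task that held no read locks.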

## How was this patch tested?

Existing tests

Author: Sean Owen <so...@cloudera.com>

Closes #17290 from srowen/SPARK-16599.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/54e61df2
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/54e61df2
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/54e61df2

Branch: refs/heads/master
Commit: 54e61df2634163382c7d01a2ad40ffb5e7270abc
Parents: ccba622
Author: Sean Owen <so...@cloudera.com>
Authored: Sat Mar 18 18:01:24 2017 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Sat Mar 18 18:01:24 2017 +0100

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/storage/BlockInfoManager.scala   | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/54e61df2/core/src/main/scala/org/apache/spark/storage/BlockInfoManager.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/storage/BlockInfoManager.scala b/core/src/main/scala/org/apache/spark/storage/BlockInfoManager.scala
index dd8f5ba..490d45d 100644
--- a/core/src/main/scala/org/apache/spark/storage/BlockInfoManager.scala
+++ b/core/src/main/scala/org/apache/spark/storage/BlockInfoManager.scala
@@ -23,7 +23,7 @@ import scala.collection.JavaConverters._
 import scala.collection.mutable
 import scala.reflect.ClassTag
 
-import com.google.common.collect.ConcurrentHashMultiset
+import com.google.common.collect.{ConcurrentHashMultiset, ImmutableMultiset}
 
 import org.apache.spark.{SparkException, TaskContext}
 import org.apache.spark.internal.Logging
@@ -340,7 +340,7 @@ private[storage] class BlockInfoManager extends Logging {
     val blocksWithReleasedLocks = mutable.ArrayBuffer[BlockId]()
 
     val readLocks = synchronized {
-      readLocksByTask.remove(taskAttemptId).get
+      readLocksByTask.remove(taskAttemptId).getOrElse(ImmutableMultiset.of[BlockId]())
     }
     val writeLocks = synchronized {
       writeLocksByTask.remove(taskAttemptId).getOrElse(Seq.empty)

