[ https://issues.apache.org/jira/browse/SPARK-3539?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-3539.
------------------------------
       Resolution: Fixed
    Fix Version/s: 1.1.1
                   1.2.0

I'm all but certain this was fixed by SPARK-1853 and https://github.com/apache/spark/commit/729952a5efce755387c76cdf29280ee6f49fdb72

The problem here is that the "scala.Option..." frame is parsed as user code, but of course it isn't; the call-site logic should keep skipping such frames, since the real user code is higher up the stack. Skipping frames whose class names start with "scala" was part of that fix.

> Task description "apply at Option.scala:120"; no user code involved
> -------------------------------------------------------------------
>
>                 Key: SPARK-3539
>                 URL: https://issues.apache.org/jira/browse/SPARK-3539
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: John Salvatier
>            Priority: Minor
>             Fix For: 1.2.0, 1.1.1
>
>
> In Spark 1.1, I'm seeing tasks with call sites that don't involve my code at all!
> I'd seen something like this before in 1.0.0, but the behavior seems to be back:
>
> apply at Option.scala:120
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.FilteredRDD.getPartitions(FilteredRDD.scala:29)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.FilteredRDD.getPartitions(FilteredRDD.scala:29)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
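
For illustration, below is a minimal Scala sketch of the frame-skipping idea described in the resolution comment above: walk the current stack trace and ignore frames from Spark internals and the Scala library (such as scala.Option.getOrElse), so the reported call site falls on the first genuine user frame. This is not Spark's actual implementation (which, as referenced by SPARK-1853, lives around org.apache.spark.util.Utils.getCallSite); the names CallSiteSketch, isInternalFrame and firstUserFrame are invented for this example.

// Minimal sketch of the frame-skipping described above; NOT the real Spark
// code. Names here are invented for illustration only.
object CallSiteSketch {

  // Treat Spark-internal and Scala-library frames as non-user code.
  private def isInternalFrame(className: String): Boolean =
    className.startsWith("org.apache.spark.") || className.startsWith("scala.")

  // Walk the current stack trace and return the first frame that is not
  // Spark-internal or Scala-library code, formatted like a call site.
  def firstUserFrame(): Option[String] =
    Thread.currentThread().getStackTrace.toSeq
      .filterNot(_.getClassName.startsWith("java.lang.Thread"))
      .find(frame => !isInternalFrame(frame.getClassName))
      .map(frame => s"${frame.getMethodName} at ${frame.getFileName}:${frame.getLineNumber}")

  def main(args: Array[String]): Unit = {
    // Prints a call site inside this object rather than something like
    // "getOrElse at Option.scala:120", because scala.* frames are skipped.
    println(firstUserFrame().getOrElse("(no user frame found)"))
  }
}

As I understand the real fix, Spark's task description combines the last Spark API method with the first user-code file and line, which is why a missed scala.* frame surfaced as the confusing "apply at Option.scala:120" instead of a location in the user's job.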