[ https://issues.apache.org/jira/browse/SPARK-3539?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14135856#comment-14135856 ]

John Salvatier commented on SPARK-3539:
---------------------------------------

Sorry for the confusion: in the Spark UI, the task appears with the description "apply at 
Option.scala:120", and clicking "details" shows the stack trace posted above.
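
For reference, here is a minimal sketch (not taken from the original report; the object and app names are made up) of the kind of pipeline that matches the posted trace: alternating map/filter stages whose partitions are resolved recursively, passing through Option.getOrElse in Spark 1.1's RDD.partitions, which is what the UI ends up reporting as the call site.

{code}
import org.apache.spark.{SparkConf, SparkContext}

object CallSiteRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("CallSiteRepro").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Alternating map/filter transformations, mirroring the MappedRDD /
    // FilteredRDD frames in the trace above.
    val rdd = sc.parallelize(1 to 1000)
      .map(_ * 2)         // MappedRDD
      .filter(_ % 3 == 0) // FilteredRDD
      .map(_ + 1)         // MappedRDD
      .filter(_ > 10)     // FilteredRDD

    // Any action forces RDD.partitions to be evaluated down the chain; in
    // Spark 1.1 that evaluation goes through Option.getOrElse, which is what
    // surfaces as "apply at Option.scala:120" in the UI task description.
    rdd.count()

    sc.stop()
  }
}
{code}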

> Task description "apply at Option.scala:120"; no user code involved
> -------------------------------------------------------------------
>
>                 Key: SPARK-3539
>                 URL: https://issues.apache.org/jira/browse/SPARK-3539
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: John Salvatier
>            Priority: Minor
>
> In Spark 1.1, I'm seeing tasks whose call sites don't involve my code at all!
> I'd seen something like this before in 1.0.0, but the behavior seems to be back:
> apply at Option.scala:120
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.FilteredRDD.getPartitions(FilteredRDD.scala:29)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.FilteredRDD.getPartitions(FilteredRDD.scala:29)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
> scala.Option.getOrElse(Option.scala:120)
> org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
> org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
> org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)


