[ https://issues.apache.org/jira/browse/SPARK-13055?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-13055:
------------------------------------

    Assignee: Andrew Or  (was: Apache Spark)

> SQLHistoryListener throws ClassCastException
> --------------------------------------------
>
>                 Key: SPARK-13055
>                 URL: https://issues.apache.org/jira/browse/SPARK-13055
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>
> {code}
> 16/01/27 18:46:28 ERROR ReplayListenerBus: Listener SQLHistoryListener threw an exception
> java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
>         at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:110)
>         at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1$$anonfun$5.apply(SQLListener.scala:334)
>         at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1$$anonfun$5.apply(SQLListener.scala:334)
>         at scala.Option.map(Option.scala:145)
>         at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1.apply(SQLListener.scala:334)
>         at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1.apply(SQLListener.scala:332)
> {code}
> SQLHistoryListener listens for SparkListenerTaskEnd events, which carry 
> non-SQL accumulators as well. We try to cast every accumulator update we 
> encounter to Long, producing errors like the one above whenever a value is 
> not actually a Long.
> Note: this was a problem even before internal accumulators were introduced. 
> If a task used a user accumulator of any type other than Long, we would 
> still hit this exception.
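> For illustration only (a minimal sketch, not the actual Spark source), the 
> snippet below reproduces the failing cast and shows a defensive alternative, 
> assuming the accumulator update surfaces as an Option[Any]:
> {code}
> // Hedged sketch: assumes accumulator updates arrive as Option[Any].
> object CastRepro {
>   def main(args: Array[String]): Unit = {
>     // A user Int accumulator arrives as a boxed java.lang.Integer.
>     val update: Option[Any] = Some(Int.box(42))
>
>     // Buggy pattern from the stack trace: a blind cast to Long.
>     // BoxesRunTime.unboxToLong throws ClassCastException on an Integer.
>     try {
>       update.map(_.asInstanceOf[Long])
>     } catch {
>       case e: ClassCastException => println(s"reproduced: $e")
>     }
>
>     // Safer pattern: keep only updates that really are Longs.
>     val safe: Option[Long] = update.collect { case l: Long => l }
>     println(s"non-Long update is dropped: $safe") // prints None
>   }
> }
> {code}
> With the pattern match, a non-Long update is simply ignored instead of 
> killing the replay listener.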


