[ https://issues.apache.org/jira/browse/SPARK-4489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14904572#comment-14904572 ]

Glenn Strycker commented on SPARK-4489:
---------------------------------------

My ticket SPARK-10762 may have just been user error, but it was interesting 
nonetheless... evidently Scala or Spark does not report the runtime type of 
the ArrayBuffer as ArrayBuffer[Any]; because asInstanceOf on a generic type is 
unchecked, the cast to ArrayBuffer[(Int,String)] appears to succeed, and the 
ClassCastException only surfaces later when an element is accessed.
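
A minimal plain-Scala sketch of that behaviour (no Spark required; the names 
here are illustrative only):
{code}
import scala.collection.mutable.ArrayBuffer
import scala.util.Try

object ErasureSketch extends App {
  // The buffer really holds Strings, not (Int, String) tuples.
  val buf: ArrayBuffer[Any] = ArrayBuffer("not", "tuples")

  // Type arguments are erased at runtime, so this cast is unchecked and
  // raises no exception even though every element has the wrong type.
  val cast = buf.asInstanceOf[ArrayBuffer[(Int, String)]]
  println(s"cast 'succeeded': ${cast.length} elements")

  // The ClassCastException only appears when an element is used as a tuple.
  println(Try(cast.head._1))  // Failure(java.lang.ClassCastException: ...)
}
{code}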

Please see the solution posted on 
http://stackoverflow.com/questions/32727518/genericrowwithschema-exception-in-casting-arraybuffer-to-hashset-in-dataframe-to,
 as it may shed light on this ticket.

For my issue, instead of using
{code}
a(4).asInstanceOf[scala.collection.mutable.ArrayBuffer[(Int,String)]]
{code}

I should be using
{code}
a(4).asInstanceOf[ArrayBuffer[Row]].map { case x: Row =>
  (x(0).asInstanceOf[Int], x(1).asInstanceOf[String])
}
{code}
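
For context, a(4) here is a field of a Row obtained by collecting a DataFrame; 
Spark hands nested structs back as Rows (GenericRowWithSchema), not as Scala 
tuples, so each one has to be unpacked field by field. A sketch of the full 
pattern, with the column position and field types assumed from my case:
{code}
import org.apache.spark.sql.Row
import scala.collection.mutable.ArrayBuffer

// Sketch only: assumes `a` is one Row from df.collect(), where field 4 holds
// an array of two-field structs. Each struct comes back as a Row, so it must
// be unpacked into a tuple explicitly.
def toPairs(a: Row): ArrayBuffer[(Int, String)] =
  a(4).asInstanceOf[ArrayBuffer[Row]].map { case x: Row =>
    (x(0).asInstanceOf[Int], x(1).asInstanceOf[String])
  }
{code}

The typed Row accessors x.getInt(0) and x.getString(1) would work equally well 
in place of the element casts.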


> JavaPairRDD.collectAsMap from checkpoint RDD may fail with ClassCastException
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-4489
>                 URL: https://issues.apache.org/jira/browse/SPARK-4489
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 1.1.0
>            Reporter: Christopher Ng
>
> Calling collectAsMap() on a JavaPairRDD reconstructed from a checkpoint fails 
> with a ClassCastException:
> Exception in thread "main" java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;
>       at org.apache.spark.rdd.PairRDDFunctions.collectAsMap(PairRDDFunctions.scala:595)
>       at org.apache.spark.api.java.JavaPairRDD.collectAsMap(JavaPairRDD.scala:569)
>       at org.facboy.spark.CheckpointBug.main(CheckpointBug.java:46)
> Code sample reproducing the issue: https://gist.github.com/facboy/8387e950ffb0746a8272
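
The sequence described there boils down to roughly the following (a sketch 
against the Spark 1.x API, not the actual code from the gist):
{code}
import org.apache.spark.{SparkConf, SparkContext}
// In Spark 1.x the pair-RDD operations (collectAsMap, etc.) come from here:
import org.apache.spark.SparkContext._

object CheckpointBugSketch extends App {
  val sc = new SparkContext(
    new SparkConf().setMaster("local").setAppName("SPARK-4489"))
  sc.setCheckpointDir("/tmp/spark-checkpoint")

  val pairs = sc.parallelize(Seq(1 -> "a", 2 -> "b"))
  pairs.checkpoint()  // must be called before any action on this RDD
  pairs.count()       // an action, so the checkpoint is actually written

  // On affected 1.1.0 builds this is where the reported exception appeared:
  // java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;
  println(pairs.collectAsMap())
}
{code}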


