Thanks, Michael.
I'm using 2.0.0.
Does the 2.0.1 stable release fix it?
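
For anyone hitting this in the meantime, a possible workaround (untested on my side, just a sketch) is to avoid mapping to a REPL-defined case class and instead pull the column out with a primitive encoder, since the cast error seems to come from the REPL's generated wrapper classes:

```scala
// Assumes a spark-shell session with `df` built as in the repro below,
// and `import spark.implicits._` already in scope (the shell does this).
// Selecting the column and re-typing with a primitive encoder sidesteps
// the REPL-generated $iw wrapper classes around the case class:
val keys = df.select($"key").as[Int]
keys.show()

// Alternatively, defining the case classes inside a single :paste block
// is another commonly suggested way to avoid REPL wrapper-class issues.
```
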

On Thu, Nov 3, 2016 at 2:01 AM, Michael Armbrust <mich...@databricks.com>
wrote:

> That's a bug.  Which version of Spark are you running?  Have you tried
> 2.0.2?
>
> On Wed, Nov 2, 2016 at 12:01 AM, 颜发才(Yan Facai) <yaf...@gmail.com> wrote:
>
>> Hi, all.
>> When I use a case class as the return value in a map function, Spark
>> always raises a ClassCastException.
>>
>> I wrote a demo:
>>
>> scala> case class Record(key: Int, value: String)
>>
>> scala> case class ID(key: Int)
>>
>> scala> val df = Seq(Record(1, "a"), Record(2, "b")).toDF
>>
>> scala> df.map{x => ID(x.getInt(0))}.show
>>
>> 16/11/02 14:52:34 ERROR Executor: Exception in task 0.0 in stage 166.0
>> (TID 175)
>> java.lang.ClassCastException: $line1401.$read$$iw$$iw$ID cannot be cast
>> to $line1401.$read$$iw$$iw$ID
>>         at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
>>
>>
>> Please tell me if I'm doing something wrong.
>> Thanks.
>>
>>
>
>
