"[D" type means a double array type. So this error simple means you have 
double[] data, but Spark needs to cast it to Double, as your schema defined.


The error message clearly indicates that the data doesn't match the type 
specified in the schema.
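
Below is a minimal Java sketch of one common way to hit this exception, assuming the DataFrame is built by hand with createDataFrame (the class name, the sample name, and the coordinate values are placeholders, not taken from your thread): the struct column must be populated with a nested Row of two Double values, not with a double[].

import java.util.Arrays;
import java.util.List;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class StructFieldExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("StructFieldExample")
                .master("local[*]")
                .getOrCreate();

        // Schema matching the one in the question:
        //  |-- name: string
        //  |-- location: struct<longitude: double, latitude: double>
        StructType locationType = DataTypes.createStructType(new StructField[]{
                DataTypes.createStructField("longitude", DataTypes.DoubleType, true),
                DataTypes.createStructField("latitude", DataTypes.DoubleType, true)
        });
        StructType schema = DataTypes.createStructType(new StructField[]{
                DataTypes.createStructField("name", DataTypes.StringType, true),
                DataTypes.createStructField("location", locationType, true)
        });

        // WRONG (assumed cause): passing a double[] for the struct field can lead to
        // "java.lang.ClassCastException: [D cannot be cast to java.lang.Double"
        // when Spark reads the struct's double members.
        // Row bad = RowFactory.create("Richard", new double[]{-122.4d, 37.8d});

        // RIGHT: the struct field is itself a Row with two Double values.
        Row good = RowFactory.create("Richard", RowFactory.create(-122.4d, 37.8d));

        List<Row> rows = Arrays.asList(good);
        Dataset<Row> df = spark.createDataFrame(rows, schema);
        df.show();

        spark.stop();
    }
}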


I wonder how you can be so sure about your data. Did you check it with another tool?


Yong


________________________________
From: Richard Xin <richardxin...@yahoo.com.INVALID>
Sent: Saturday, December 17, 2016 10:56 AM
To: zjp_j...@163.com; user
Subject: Re: Java to show struct field from a Dataframe

data is good


On Saturday, December 17, 2016 11:50 PM, "zjp_j...@163.com" <zjp_j...@163.com> 
wrote:


I think the cause is invalid Double data in your input. Have you checked your data?

________________________________
zjp_j...@163.com

From: Richard Xin <richardxin...@yahoo.com.INVALID>
Date: 2016-12-17 23:28
To: User <user@spark.apache.org>
Subject: Java to show struct field from a Dataframe
let's say I have a DataFrame with the following schema:
root
 |-- name: string (nullable = true)
 |-- location: struct (nullable = true)
 |    |-- longitude: double (nullable = true)
 |    |-- latitude: double (nullable = true)

df.show(); throws the following exception:

java.lang.ClassCastException: [D cannot be cast to java.lang.Double
    at scala.runtime.BoxesRunTime.unboxToDouble(BoxesRunTime.java:119)
    at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getDouble(rows.scala:44)
    at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getDouble(rows.scala:221)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
....

Any advice?
Thanks in advance.
Richard

