Yeah, same issue here; as far as I can tell it is still not solved.
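
In the meantime, one possible workaround (just an untested sketch on my side; the connection URL, credentials and table name below are placeholders) is to push the cast down to Postgres in the JDBC subquery, so the driver hands Spark plain doubles instead of java.math.BigDecimal and the Decimal conversion that SPARK-5456 covers is never exercised:

      // Untested sketch: cast numeric(10, 2) to double precision inside the
      // query that Spark pushes to Postgres, so the JDBC rows carry doubles.
      val employees = sqlContext.load("jdbc", Map(
        "url"     -> "jdbc:postgresql://localhost:5432/mydb?user=me&password=secret",
        "dbtable" -> "(select name, salary::double precision as salary from employees) as emp"))

      employees.registerTempTable("Employees")

      // sum(salary) now aggregates doubles, so read the result back with
      // getDouble instead of getDecimal.
      sqlContext.sql("select name, sum(salary) as total from Employees group by name")
        .take(10)
        .map(row => (row.getString(0), row.getDouble(1)))
        .foreach(println)

Since the cast happens on the Postgres side, the numeric type never reaches Spark's JDBC relation, which is where the BigDecimal-to-Decimal conversion seems to go wrong.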

----- Original Message -----
From: Ted Yu <yuzhih...@gmail.com>
To: doovs...@sina.com
Cc: user <user@spark.apache.org>
Subject: Re: Spark SQL 1.3.1: java.lang.ClassCastException is thrown
Date: April 25, 2015, 22:04

Looks like this is related:
https://issues.apache.org/jira/browse/SPARK-5456

On Sat, Apr 25, 2015 at 6:59 AM,  <doovs...@sina.com> wrote:
Hi all,

When I query PostgreSQL through Spark SQL like this:

      dataFrame.registerTempTable("Employees")

      val emps = sqlContext.sql("select name, sum(salary) from Employees group 
by name, salary")

      monitor {

        emps.take(10)

          .map(row => (row.getString(0), row.getDecimal(1)))

          .foreach(println)

      }



The salary column in the table is of type numeric(10, 2).



It throws the following exception:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.ClassCastException: java.math.BigDecimal cannot be cast to org.apache.spark.sql.types.Decimal



Does anyone know about this issue and how to solve it? Thanks.



Regards,

Yi
