Re: [Spark 2.0] Why MutableInt cannot be cast to MutableLong?

2016-07-31 Thread Chanh Le
If I have a column stored in a parquet file as INT, and I create a table with the same column but change the type from int to bigint, then in Spark 2.0 it shows this error: Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 259.0 failed 4 times, most recent
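The mismatch described above can be illustrated without Spark at all: parquet stores an INT column as 4-byte INT32 values, so a reader expecting 8-byte INT64 (bigint) cannot simply reinterpret the bytes; each value has to be widened explicitly, which is what a cast to bigint does logically. A minimal stdlib-Python sketch (the function name is made up for illustration):

```python
# Toy stdlib illustration (not Spark code): parquet stores the column as
# 4-byte INT32 values, so a reader expecting 8-byte INT64 cannot just
# reinterpret the bytes -- each value must be widened explicitly.
import struct

def read_int32_as_int64(raw: bytes) -> list:
    """Decode little-endian INT32 values, then widen them to INT64."""
    n = len(raw) // 4
    vals = struct.unpack(f"<{n}i", raw)           # read with the file's real type
    widened = struct.pack(f"<{n}q", *vals)        # explicit, lossless widening
    return list(struct.unpack(f"<{n}q", widened))

raw = struct.pack("<3i", 1, -2, 2**31 - 1)
assert read_int32_as_int64(raw) == [1, -2, 2**31 - 1]

# Naively reinterpreting the same bytes as INT64 merges two INT32 values
# into one bogus number -- the binary layouts are simply incompatible:
assert struct.unpack("<1q", raw[:8])[0] != 1
```

A practical consequence: declaring the table column as bigint over files whose physical type is INT32 triggers exactly this kind of mismatch; keeping the declared type aligned with the file, or casting after reading, avoids it.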

Re: [Spark 2.0] Why MutableInt cannot be cast to MutableLong?

2016-07-31 Thread Chanh Le
Sorry, my bad: I was actually running Spark 1.6.1. But what about this error? Why can't Int be cast to Long? Thanks. > On Aug 1, 2016, at 2:44 AM, Michael Armbrust wrote: > > Are you sure you are running Spark 2.0? > > In your stack trace I see SqlNewHadoopRDD, which was removed in

Re: [Spark 2.0] Why MutableInt cannot be cast to MutableLong?

2016-07-31 Thread Michael Armbrust
Are you sure you are running Spark 2.0? In your stack trace I see SqlNewHadoopRDD, which was removed in #12354. On Sun, Jul 31, 2016 at 2:12 AM, Chanh Le wrote: > Hi everyone, > Why *MutableInt* cannot be cast to *MutableLong?*

[Spark 2.0] Why MutableInt cannot be cast to MutableLong?

2016-07-31 Thread Chanh Le
Hi everyone, why can't MutableInt be cast to MutableLong? It’s really weird, and Spark 2.0 seems to have a lot of parquet format errors like this. org.apache.spark.sql.catalyst.expressions.MutableInt cannot be cast to org.apache.spark.sql.catalyst.expressions.MutableLong Caused by:
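For context on where this exception comes from: Catalyst's mutable rows keep each field in a typed value holder (classes like MutableInt and MutableLong), and the parquet reader picks the holder from the file's schema while the query plan expects the holder matching the table's declared type, so a plain class cast between the two fails. A toy Python model of that pattern (NOT Spark's actual code; `holder_for_parquet_type` is a hypothetical helper):

```python
# Toy model of the failure (not Spark's actual code): typed per-field
# holders, where the reader chooses the holder from the *file* schema
# but the query expects the holder for the *table* schema.

class MutableInt:
    def __init__(self):
        self.value = 0

class MutableLong:
    def __init__(self):
        self.value = 0

def holder_for_parquet_type(physical_type):
    # hypothetical helper: pick a holder from the file's physical type
    return {"INT32": MutableInt, "INT64": MutableLong}[physical_type]()

holder = holder_for_parquet_type("INT32")   # the file stores INT32
holder.value = 42

# A table declared as bigint expects a MutableLong, so treating the INT32
# holder as one fails -- the analogue of the ClassCastException above:
assert not isinstance(holder, MutableLong)

# The fix is a value-level conversion (an explicit cast in the query),
# not a reinterpretation of the holder object:
long_holder = MutableLong()
long_holder.value = int(holder.value)
assert long_holder.value == 42
```

The practical takeaway matches the thread: the declared column type must agree with the file's physical type, or the conversion must be made explicit, because the runtime will not silently reinterpret one holder class as another.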