Hi,
I recently upgraded my Spark 1.0.0 cluster to 1.0.1.
But it gives me "ERROR remote.EndpointWriter: AssociationError" when I run a
simple Spark SQL job in spark-shell.
Here is the code:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
case class Person(name:String, Age:Int,
UPDATES:
It happens only when I use a case class and map an RDD to that class in
spark-shell.
Other RDD transformations, SchemaRDDs backed by Parquet files, and the rest
of the Spark SQL operations work fine.
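For reference, this is the kind of minimal reproduction I mean. The file name, field names, and the parsing helper below are my own assumptions for illustration, not the exact job that failed:

```scala
// Hypothetical minimal reproduction (Spark 1.0.x API), assuming a
// people.txt containing "name,age" lines.
case class Person(name: String, age: Int)

// Parsing helper used inside the RDD map below.
def parsePerson(line: String): Person = {
  val fields = line.split(",")
  Person(fields(0), fields(1).trim.toInt)
}

// In spark-shell:
// val sqlContext = new org.apache.spark.sql.SQLContext(sc)
// import sqlContext._
// val people = sc.textFile("people.txt").map(parsePerson) // this map step is where the error appears
// people.registerAsTable("people")
// sqlContext.sql("SELECT name FROM people WHERE age > 20").collect()
```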
Were there any changes related to case class handling between 1.0.0 and
1.0.1?
Best regards,
Kevin