UPDATES:
It happens only when I use a case class and map an RDD to that class in
spark-shell.
Other RDD transformations, SchemaRDDs backed by Parquet files, and all
other SparkSQL operations work fine.
Were there any changes to case class handling between 1.0.0 and
1.0.1?
Best regards
Kevin
I'm getting similar errors on Spark Streaming -- but at this point in my
project I don't need a cluster and can develop locally. I'll write it up
later, though, if it persists.
On Tue, Jul 15, 2014 at 7:44 PM, Kevin Jung wrote:
> Hi,
> I recently upgraded my Spark 1.0.0 cluster to 1.0.1.
> But
Hi,
I recently upgraded my Spark 1.0.0 cluster to 1.0.1.
But it gives me "ERROR remote.EndpointWriter: AssociationError" when I run a
simple SparkSQL job in spark-shell.
Here is the code:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
case class Person(name:String, Age:Int,
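The snippet above is cut off after the first two fields of `Person`. For reference, a minimal complete version of this pattern against the Spark 1.0.x API might look like the following; the input file name and the shape of the data are hypothetical placeholders, not taken from the original message:

```scala
// Run inside spark-shell, where `sc` (the SparkContext) is already defined.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._  // brings createSchemaRDD and the sql(...) method into scope

// Truncated in the original message; only `name` and `Age` are known fields.
case class Person(name: String, Age: Int)

// Map a plain text RDD onto the case class -- the step reported to fail in 1.0.1.
// "people.txt" is a hypothetical CSV file with lines like "Alice,30".
val people = sc.textFile("people.txt")
  .map(_.split(","))
  .map(p => Person(p(0), p(1).trim.toInt))

// Implicit conversion to SchemaRDD, then register for SQL queries
// (registerAsTable is the Spark 1.0.x name; later versions renamed it).
people.registerAsTable("people")

val teenagers = sql("SELECT name FROM people WHERE Age >= 13 AND Age <= 19")
teenagers.map(t => "Name: " + t(0)).collect().foreach(println)
```

This needs a running spark-shell session, so it is a sketch to show the pattern rather than a standalone program.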