SparkSQL HiveContext TypeTag compile error

2014-09-11 Thread Du Li
Hi,

I have the following code snippet. It works fine in spark-shell, but in a 
standalone app it reports "No TypeTag available for MySchema" at compile time 
when calling hc.createSchemaRDD(rdd). Does anybody know what might be missing?

Thanks,
Du

--
import org.apache.spark.sql.hive.HiveContext

case class MySchema(key: Int, value: String)

val hc = new HiveContext(sc)  // sc is the SparkContext
val rdd = sc.parallelize((1 to 10).map(i => MySchema(i, s"val$i")))
val schemaRDD = hc.createSchemaRDD(rdd)
schemaRDD.registerTempTable("data")
val rows = hc.sql("select * from data")
rows.collect.foreach(println)


Re: SparkSQL HiveContext TypeTag compile error

2014-09-11 Thread Du Li
Solved it.

The problem occurred because the case class was defined within a test case in 
FunSuite. Moving the case class definition out of the test fixed the problem.
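For anyone hitting this outside Spark: a minimal sketch with plain scala-reflect (names hypothetical) showing why the definition site matters. The compiler can materialize a TypeTag for a top-level case class, while a case class local to a method or test body triggers the same "No TypeTag available" compile error:

```scala
import scala.reflect.runtime.universe._

object TypeTagDemo {
  // Top level: the compiler can materialize a TypeTag for this type.
  case class MySchema(key: Int, value: String)

  def main(args: Array[String]): Unit = {
    // Resolves fine for a top-level case class.
    println(typeOf[MySchema])

    // By contrast, uncommenting the following fails to compile with
    // "No TypeTag available for Local", because Local is method-local:
    // case class Local(key: Int)
    // println(typeOf[Local])
  }
}
```

This is the same reason createSchemaRDD, whose implicit conversion needs a `TypeTag` for the element type, only works when the case class is visible at a stable, non-local path.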




Re: SparkSQL HiveContext TypeTag compile error

2014-09-11 Thread Du Li

Just moving it out of the test is not enough. The case class definition must be 
moved to the top level. Otherwise it reports a runtime error of "task not 
serializable" when executing collect().
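A minimal sketch outside Spark (names hypothetical) of why a nested case class trips serialization: an instance of a case class defined inside another class holds a hidden `$outer` reference, so Java serialization tries to drag the enclosing instance (here, the FunSuite) along with it and fails if that outer class is not serializable:

```scala
import java.io._

object SerializationDemo {
  // Top-level case class: no outer reference, serializes cleanly.
  case class TopLevel(key: Int)

  // Stands in for a FunSuite: a non-serializable enclosing class.
  class Outer {
    // Instances of Inner carry a hidden $outer pointer to their Outer.
    case class Inner(key: Int)
  }

  // Returns true if the object round-trips through Java serialization.
  def serializable(obj: AnyRef): Boolean =
    try {
      val out = new ObjectOutputStream(new ByteArrayOutputStream())
      out.writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    println(serializable(TopLevel(1)))        // top-level: serializes
    val outer = new Outer
    println(serializable(new outer.Inner(1))) // nested: fails, drags in Outer
  }
}
```

Spark surfaces the same failure as "task not serializable" when it ships the closure (and the records it produces) to executors during collect().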
