Re: SparkSQL with sequence file RDDs

2014-07-07 Thread Michael Armbrust
I haven't heard any reports of this yet, but I don't see any reason why it wouldn't work. You'll need to manually convert the objects that come out of the sequence file into something where SparkSQL can detect the schema (i.e. Scala case classes or Java beans) before you can register the RDD as a table.
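A minimal sketch of that conversion against the Spark 1.0-era API (SQLContext, the createSchemaRDD implicit, and registerAsTable); the Record fields and the HDFS path below are illustrative assumptions, not details from this thread:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._   // brings in the Writable converters for sequenceFile
import org.apache.spark.sql.SQLContext

// Hypothetical record type: substitute the fields your sequence file actually holds.
case class Record(key: String, value: Int)

val sc = new SparkContext("local", "seqfile-sql")
val sqlContext = new SQLContext(sc)
import sqlContext.createSchemaRDD   // implicit RDD -> SchemaRDD conversion (Spark 1.0)

// Read the sequence file, then map the raw key/value pairs into case classes
// so SparkSQL can infer a schema by reflection.
val records = sc.sequenceFile[String, Int]("hdfs:///path/to/data")   // path is illustrative
  .map { case (k, v) => Record(k, v) }

records.registerAsTable("records")   // Spark 1.0 name; later versions call it registerTempTable
sqlContext.sql("SELECT key, value FROM records WHERE value > 0").collect().foreach(println)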

RE: SparkSQL with sequence file RDDs

2014-07-07 Thread Haoming Zhang
(Quotes Michael Armbrust's reply above.)

Re: SparkSQL with sequence file RDDs

2014-07-07 Thread Michael Armbrust
We know Scala 2.11 has removed the limit on the number of case-class parameters, but Spark 1.0 is not compatible with it. So now we are considering using Java beans instead of Scala case classes. You can also manually create a class that implements Scala's Product interface. Finally, SPARK-2179
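For the Product route, a minimal sketch; the two fields here are placeholders, and in practice you would declare all of your 22+ fields and extend productArity/productElement to match:

// A plain class implementing Product directly, sidestepping the 22-field
// case-class ceiling of Scala 2.10. Field names and types are hypothetical.
class WideRecord(val f1: String, val f2: Int /* ..., as many fields as needed */)
    extends Product with Serializable {
  def canEqual(that: Any): Boolean = that.isInstanceOf[WideRecord]
  def productArity: Int = 2          // total number of fields
  def productElement(n: Int): Any = n match {
    case 0 => f1
    case 1 => f2
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }
}

Implementing Product by hand gives SparkSQL's reflection the same hooks a case class would provide, without the arity limit.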

RE: SparkSQL with sequence file RDDs

2014-07-07 Thread Haoming Zhang
(Quotes Michael Armbrust's second reply above.)

Re: SparkSQL with sequence file RDDs

2014-07-07 Thread Michael Armbrust
(Quotes the preceding exchange.)