I haven't heard any reports of this yet, but I don't see any reason why it
wouldn't work. You'll need to manually convert the objects that come out of
the sequence file into something where SparkSQL can detect the schema (i.e.
Scala case classes or Java beans) before you can register the RDD as a table.
Date: Mon, 7 Jul 2014 17:12:42 -0700
Subject: Re: SparkSQL with sequence file RDDs
To: user@spark.apache.org
We know Scala 2.11 has removed the limit on the number of parameters, but
Spark 1.0 is not compatible with it. So now we are considering using Java
beans instead of Scala case classes.
You can also manually create a class that implements Scala's Product
interface. Finally, SPARK-2179 ...
From: mich...@databricks.com
Date: Mon, 7 Jul 2014 17:52:34 -0700
Subject: Re: SparkSQL with sequence file RDDs
To: user@spark.apache.org
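Sketched below is one such hand-written Product implementation. Only three illustrative fields are shown (`id`, `name`, `score` are assumptions); a real record would list every field, including those past the 22-field case-class limit:

```scala
// Hand-rolled alternative to a case class: implement Scala's Product
// trait directly, sidestepping the 22-field case-class limit of
// Scala versions before 2.11. Field names and types are illustrative.
class Record(val id: Long, val name: String, val score: Double)
    extends Product with Serializable {
  def productArity: Int = 3
  def productElement(n: Int): Any = n match {
    case 0 => id
    case 1 => name
    case 2 => score
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }
  def canEqual(that: Any): Boolean = that.isInstanceOf[Record]
}

val r = new Record(1L, "alice", 0.9)
println(r.productElement(1))  // alice
```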