Hello,
Is there any reason for not publishing the Spark REPL in version 1.2.0?
In repl/pom.xml the deploy and publish steps are skipped.
Regards,
Dirceu
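For context, skipping the deploy step for a Maven module is typically done with the configuration below. This is a sketch of the kind of setting being discussed, not the verbatim contents of Spark's repl/pom.xml:

```xml
<!-- Sketch: how a module's pom.xml can skip the deploy step.
     Not the actual contents of Spark's repl/pom.xml. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-deploy-plugin</artifactId>
      <configuration>
        <skip>true</skip>
      </configuration>
    </plugin>
  </plugins>
</build>
```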
It was not intended to be a public API, but there is a request to keep
publishing it as a developer API:
https://issues.apache.org/jira/browse/SPARK-4923
On Dec 26, 2014 2:09 PM, Dirceu Semighini Filho
dirceu.semigh...@gmail.com wrote:
Hello,
Is there any reason in not publishing spark repl in
Do we have access to the SQL specification (say, SQL-92) for reference
during Spark SQL development? I know it's not freely available on the web.
Usually, you can only access drafts.
I know that, generally, we look to other systems (especially Hive) when
figuring out how something in Spark SQL
How, O how can this be? Doesn't the SQLContext hold a reference to the
SparkContext?
Alex
I am building spark with sbt off of branch 1.2. I'm using the following
command:
sbt/sbt -Pyarn -Phadoop-2.3 assembly
(http://spark.apache.org/docs/latest/building-spark.html#building-with-sbt)
Although the jar file I obtain does contain the proper version of the
Hadoop libraries (v. 2.4), the
Can you try this command?
sbt/sbt -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive assembly
On Fri, Dec 26, 2014 at 6:15 PM, Alessandro Baretta alexbare...@gmail.com
wrote:
I am building spark with sbt off of branch 1.2. I'm using the following
command:
sbt/sbt -Pyarn -Phadoop-2.3
The SparkContext reference is transient.
On Fri, Dec 26, 2014 at 6:11 PM, Alessandro Baretta alexbare...@gmail.com
wrote:
How, O how can this be? Doesn't the SQLContext hold a reference to the
SparkContext?
Alex
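The reply above points out that SQLContext's SparkContext reference is transient, which is why it does not travel with serialized closures. A minimal, self-contained sketch of that mechanism (the class names here are hypothetical, not Spark's):

```java
import java.io.*;

// Sketch: a transient field is skipped by Java serialization and comes
// back as null after a round trip, mirroring how SQLContext's transient
// SparkContext reference is dropped when the object is shipped.
class Holder implements Serializable {
    transient Object context = new Object(); // not written to the stream
}

public class TransientDemo {
    // Serialize and deserialize an object in memory.
    static Object roundTrip(Object o) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            new ObjectOutputStream(bos).writeObject(o);
            return new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray())).readObject();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Holder h = new Holder();
        System.out.println(h.context != null);      // present before serialization
        Holder copy = (Holder) roundTrip(h);
        System.out.println(copy.context == null);   // dropped after deserialization
    }
}
```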
Here's what I get:
./assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.6.0.jar
Alex
On Fri, Dec 26, 2014 at 8:41 PM, Ted Yu yuzhih...@gmail.com wrote:
Can you try this command?
sbt/sbt -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive assembly
On Fri, Dec 26, 2014 at 6:15
Michael,
I'm having trouble storing my SchemaRDDs in Parquet format with Spark SQL,
due to my RDDs having DateType and DecimalType fields. What would it
take to add Parquet support for these Catalyst types? Are there any other
Catalyst types for which there is no Parquet support?
Alex
Hi Alessandro,
It's fixed by SPARK-3787 and will be applied to 1.2.1 and 1.3.0.
https://issues.apache.org/jira/browse/SPARK-3787
- Kousuke
(2014/12/27 11:15), Alessandro Baretta wrote:
I am building spark with sbt off of branch 1.2. I'm using the following
command:
sbt/sbt -Pyarn