replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Cody Koeninger
We were previously using SPARK_JAVA_OPTS to set java system properties via -D. This was used for properties that varied on a per-deployment-environment basis, but needed to be available in the spark shell and workers. On upgrading to 1.0, we saw that SPARK_JAVA_OPTS had been deprecated, and
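A minimal sketch of the migration under discussion, with my.prop as a purely illustrative property name (the two extraJavaOptions keys are the replacements the 1.0 deprecation warning points to):

# conf/spark-env.sh (pre-1.0 style, deprecated in 1.0):
export SPARK_JAVA_OPTS="-Dmy.prop=per-env-value"

# conf/spark-defaults.conf (1.0 replacement keys):
spark.driver.extraJavaOptions   -Dmy.prop=per-env-value
spark.executor.extraJavaOptions -Dmy.prop=per-env-value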

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Marcelo Vanzin
Hi Cody, Could you file a bug for this if there isn't one already? For system properties, SparkSubmit should be able to read those settings and do the right thing, but that obviously won't work for other JVM options... the current code should work fine in cluster mode, though, since the driver is
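A sketch of the distinction being drawn here, using the placeholder property from later in the thread: a -D system property can still be applied to an already-running driver JVM in client mode, whereas a real JVM flag such as -XX:+UseG1GC cannot, since it has to be on the JVM's command line.

# Both values end up in spark.driver.extraJavaOptions, but only the -D part
# can be applied after the driver JVM has started (client mode); the -XX flag
# only takes effect when the JVM is launched with it (e.g. cluster mode).
spark.driver.extraJavaOptions  -Dfoo.bar.baz=23 -XX:+UseG1GC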

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Patrick Wendell
Cody - in your example you are using the '=' character, but in our documentation and tests we use whitespace to separate the key and value in the defaults file.
docs: http://spark.apache.org/docs/latest/configuration.html
spark.driver.extraJavaOptions -Dfoo.bar.baz=23
I'm not sure if the java

spark 0.9.0 with hadoop 2.4 ?

2014-07-30 Thread yao
Hi Everyone, We got some YARN-related errors when running Spark 0.9.0 on Hadoop 2.4 (but it was okay on Hadoop 2.2). I didn't find any comments saying that Spark 0.9.0 could support Hadoop 2.4, so could I assume that we have to upgrade Spark to the latest release version at this point to solve this

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Cody Koeninger
Both whitespace and the equals sign are valid separators in a properties file. Here's an example:
$ cat conf/spark-defaults.conf
spark.driver.extraJavaOptions -Dfoo.bar.baz=23
$ ./bin/spark-shell -v
Using properties file: /opt/spark/conf/spark-defaults.conf
Adding default property:
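Assuming the defaults file is read with java.util.Properties, which accepts whitespace, '=', or ':' between key and value, either of these two forms should load the same key/value pair:

spark.driver.extraJavaOptions -Dfoo.bar.baz=23
spark.driver.extraJavaOptions=-Dfoo.bar.baz=23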

Re: replacement for SPARK_JAVA_OPTS

2014-07-30 Thread Cody Koeninger
In addition, spark.executor.extraJavaOptions does not seem to behave as I would expect; Java arguments don't seem to be propagated to the executors.
$ cat conf/spark-defaults.conf
spark.master mesos://zk://etl-01.mxstg:2181,etl-02.mxstg:2181,etl-03.mxstg:2181/masters
spark.executor.extraJavaOptions
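One quick way to check from the shell whether a -D property actually reached the executor JVMs (sc is the SparkContext the shell provides; foo.bar.baz is just the placeholder property from earlier in the thread):

// Run a tiny job and read the system property inside the tasks; if
// spark.executor.extraJavaOptions was propagated, every element of the
// returned array holds the value, otherwise it holds nulls.
sc.parallelize(1 to 4, 4).map(_ => System.getProperty("foo.bar.baz")).collect()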

Re: spark 0.9.0 with hadoop 2.4 ?

2014-07-30 Thread yao
I think I might have found the root cause: YARN-1931 addresses the incompatibility. The solution for my case might be either to take the related Spark patches or to do an upgrade. On Wed, Jul 30, 2014 at 2:11 PM, yao yaosheng...@gmail.com wrote: Hi Everyone, We got some YARN-related errors when running

subscribe dev list for spark

2014-07-30 Thread Grace

Re: subscribe dev list for spark

2014-07-30 Thread Ted Yu
See the Mailing list section of: https://spark.apache.org/community.html On Wed, Jul 30, 2014 at 6:53 PM, Grace syso...@gmail.com wrote:

failed to build spark with maven for both 1.0.1 and latest master branch

2014-07-30 Thread yao
Hi Folks, Today I am trying to build Spark using Maven; however, the following command failed consistently for both 1.0.1 and the latest master. (BTW, it seems sbt works fine: sbt/sbt -Dhadoop.version=2.4.0 -Pyarn clean assembly)
Environment: Mac OS Mavericks
Maven: 3.2.2 (installed by
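For what it's worth, the Maven invocation that the "Building Spark with Maven" docs of that era gave as the equivalent of that sbt command looked roughly like the following; the MAVEN_OPTS line is the documented workaround for the PermGen/heap errors that were a common cause of consistently failing Maven builds:

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
mvn -Pyarn -Dhadoop.version=2.4.0 -DskipTests clean package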