SparkConf is only for Spark properties, so I think it will in general
only pay attention to and preserve properties whose names start with
"spark.". You could experiment to confirm that. I wouldn't rely on
Spark's mechanisms for your own configuration; you can use any config
mechanism you like to load and retain your own properties.
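
For example, here's a minimal sketch of loading the same file with
plain java.util.Properties (assuming the path you passed to
--properties-file is also readable by the driver at runtime):

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    // Load your own properties file directly instead of going through
    // SparkConf, which generally only keeps "spark.*" keys.
    Properties props = new Properties();
    try (InputStream in =
            new FileInputStream("/home/emre/data/mymodule.properties")) {
        props.load(in);
    } catch (IOException e) {
        throw new RuntimeException("Could not load module properties", e);
    }
    String outputDir = props.getProperty("job.output.dir");

Alternatively, you could try renaming your own keys with a "spark."
prefix (e.g. spark.job.output.dir) so that SparkConf preserves them,
but that's exactly the behavior I'd suggest verifying experimentally.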

On Mon, Feb 16, 2015 at 3:26 PM, Emre Sevinc <emre.sev...@gmail.com> wrote:
> Hello,
>
> I'm using Spark 1.2.1 and have a module.properties file that contains
> non-Spark properties as well as Spark properties, e.g.:
>
>    job.output.dir=file:///home/emre/data/mymodule/out
>
> I'm trying to pass it to spark-submit via:
>
>    spark-submit --class com.myModule --master local[4] --deploy-mode client
> --verbose --properties-file /home/emre/data/mymodule.properties mymodule.jar
>
> And I thought I could read the value of my non-Spark property, namely,
> job.output.dir by using:
>
>     SparkConf sparkConf = new SparkConf();
>     final String validatedJSONoutputDir = sparkConf.get("job.output.dir");
>
> But it gives me an exception:
>
>     Exception in thread "main" java.util.NoSuchElementException:
> job.output.dir
>
> Is it not possible to mix Spark and non-Spark properties in a single
> .properties file, pass it via --properties-file, and then read the values
> of those non-Spark properties via SparkConf?
>
> Or is there another object / method to retrieve the values for those
> non-Spark properties?
>
>
> --
> Emre Sevinç
