How about system properties? Or something like Typesafe Config, which
at least lets you override values in a built-in config file from the
command line, via system properties or other files.
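
For example, with Typesafe Config (a minimal sketch, assuming the
Typesafe Config jar is on the driver's classpath and module.properties
is on the classpath, i.e. inside the application jar):

    import com.typesafe.config.Config;
    import com.typesafe.config.ConfigFactory;

    // load("module") reads module.(conf|json|properties) from the
    // classpath; -Dkey=value system properties passed on the command
    // line take precedence over the values baked into the file
    Config config = ConfigFactory.load("module");
    String outputDir = config.getString("job.output.dir");

Then something like -Djob.output.dir=... on the driver JVM (for
example via spark.driver.extraJavaOptions) should override the
built-in value without rebuilding the jar.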

On Mon, Feb 16, 2015 at 3:38 PM, Emre Sevinc <emre.sev...@gmail.com> wrote:
> Sean,
>
> I'm trying this as an alternative to what I currently do. Currently I
> have a module.properties file in my module's resources directory; it
> is packaged into the über JAR when I build the application with
> Maven, and when I submit the application with spark-submit I can read
> the file via the traditional method:
>
>
> properties.load(MyModule.class.getClassLoader().getResourceAsStream("module.properties"));
>
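> Spelled out, that is roughly the following (a sketch, error handling
> omitted):
>
>     import java.io.InputStream;
>     import java.util.Properties;
>
>     Properties properties = new Properties();
>     // reads module.properties from the classpath, i.e. from inside
>     // the über JAR
>     try (InputStream in = MyModule.class.getClassLoader()
>             .getResourceAsStream("module.properties")) {
>         properties.load(in);
>     }
>     String outputDir = properties.getProperty("job.output.dir");
>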
> That works fine. The disadvantage is that to make any change to the
> .properties file effective, I have to rebuild my application.
> Therefore I'm trying to find a way to send that module.properties
> file via spark-submit and read the values in it, so that I won't be
> forced to rebuild the application every time I want to change the
> file.
>
> I've also checked the "--files" option of spark-submit, but as far as
> I can tell it is for shipping the listed files to the executors
> (correct me if I'm wrong), whereas what I'm after is passing dynamic
> properties (key/value pairs) to the driver program of my Spark
> application. I still haven't found out how to do that.
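>
> The closest thing I can think of (an untested sketch) is to pass the
> path of the properties file as a plain program argument after the jar
> and load it in the driver myself:
>
>     spark-submit --class com.myModule --master local[4] \
>         mymodule.jar /home/emre/data/module.properties
>
> and then, in the driver's main(String[] args):
>
>     import java.io.FileInputStream;
>     import java.io.InputStream;
>     import java.util.Properties;
>
>     Properties props = new Properties();
>     // args[0] is the path given after the jar on the command line
>     try (InputStream in = new FileInputStream(args[0])) {
>         props.load(in);
>     }
>     String outputDir = props.getProperty("job.output.dir");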
>
> --
> Emre
>
>
>
>
>
> On Mon, Feb 16, 2015 at 4:28 PM, Sean Owen <so...@cloudera.com> wrote:
>>
>> Since SparkConf is only for Spark properties, I think it will in
>> general only pay attention to and preserve "spark.*" properties. You
>> could experiment with that. Beyond that, I wouldn't rely on Spark
>> mechanisms for your configuration; you can use any config mechanism
>> you like to retain your own properties.
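>>
>> If you do want to piggyback on --properties-file, one thing to
>> experiment with (untested, and arguably a hack) is prefixing your
>> own keys with "spark." so that SparkConf preserves them; the "myapp"
>> segment below is just an example:
>>
>>     # in the file passed via --properties-file
>>     spark.myapp.job.output.dir=file:///home/emre/data/mymodule/out
>>
>>     // in the driver
>>     import org.apache.spark.SparkConf;
>>
>>     SparkConf sparkConf = new SparkConf();
>>     String outputDir = sparkConf.get("spark.myapp.job.output.dir");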
>>
>> On Mon, Feb 16, 2015 at 3:26 PM, Emre Sevinc <emre.sev...@gmail.com>
>> wrote:
>> > Hello,
>> >
>> > I'm using Spark 1.2.1 and have a module.properties file that
>> > contains non-Spark properties as well as Spark properties, e.g.:
>> >
>> >    job.output.dir=file:///home/emre/data/mymodule/out
>> >
>> > I'm trying to pass it to spark-submit via:
>> >
>> >    spark-submit --class com.myModule --master local[4] \
>> >        --deploy-mode client --verbose \
>> >        --properties-file /home/emre/data/mymodule.properties \
>> >        mymodule.jar
>> >
>> > And I thought I could read the value of my non-Spark property, namely,
>> > job.output.dir by using:
>> >
>> >     SparkConf sparkConf = new SparkConf();
>> >     final String validatedJSONoutputDir =
>> >         sparkConf.get("job.output.dir");
>> >
>> > But it gives me an exception:
>> >
>> >     Exception in thread "main" java.util.NoSuchElementException:
>> >     job.output.dir
>> >
>> > Is it not possible to mix Spark and non-Spark properties in a single
>> > .properties file, then pass it via --properties-file and then get the
>> > values
>> > of those non-Spark properties via SparkConf?
>> >
>> > Or is there another object / method to retrieve the values for those
>> > non-Spark properties?
>> >
>> >
>> > --
>> > Emre Sevinç
>
>
>
>
> --
> Emre Sevinc

