After using Spark 1.2 for quite a long time, I have realised that you can
no longer pass Spark configuration to the driver via --conf on the command
line (or, in my case, from a shell script).

I am thinking about using system properties and picking the config up using
the following bit of code:

    def getConfigOption(conf: SparkConf, name: String): Option[String] =
      conf.getOption(name).orElse(sys.props.get(name))
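For what it is worth, here is a minimal, Spark-free sketch of the fallback I have in mind. A plain Map stands in for SparkConf so it runs without a Spark dependency, and the config.file / startTime keys are just the ones from my submit command:

```scala
// Spark-free sketch of the fallback lookup: try the "conf" first,
// then fall back to JVM system properties (-D flags on the driver).
object ConfigFallback {
  def getConfigOption(conf: Map[String, String], name: String): Option[String] =
    conf.get(name).orElse(sys.props.get(name))

  def main(args: Array[String]): Unit = {
    // Simulate -DstartTime=... having been passed to the driver JVM.
    sys.props("startTime") = "2016-06-04 00:00:00"
    val conf = Map("config.file" -> "../conf/mifid.conf")

    println(getConfigOption(conf, "config.file")) // found in the conf map
    println(getConfigOption(conf, "startTime"))   // falls back to sys.props
  }
}
```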

How do I pass a config.file option and a string version of the date specified
as a start time to a spark-submit command?

I have attempted using the following in my start-up shell script:

    other config options \
    --conf "spark.executor.extraJavaOptions=-Dconfig.file=../conf/mifid.conf -DstartTime=2016-06-04 00:00:00" \

but this fails: the space in the date splits the command up.

Any idea how to do this successfully, or has anyone got any advice on this
one?

Thanks

K
