I accidentally got it working, though I don't thoroughly understand why (as far
as I can tell, the combination lets the executors resolve the conf file after it
has been copied to their working directories). The key point is that it takes a
combination of the parameters --conf, --files, and --driver-class-path, rather
than any single parameter:

spark-submit --class pkg.to.MyApp \
  --master local[*] \
  --conf "spark.executor.extraClassPath=-Dconfig.file=<myfile.conf>" \
  --files <conf/myfile.conf> \
  --driver-class-path "</absolute/path/to/conf/dir>"

--conf passes the conf file name, e.g. myfile.conf, as a -Dconfig.file
directive appended to the Spark executor class path.

--files ships the conf file, given as a path relative to the project root. For
example, when executing under the directory <my-project>, which contains
folders such as conf, logs, work, and so on, the conf file myfile.conf is
located under the conf folder.

--driver-class-path points to the conf directory, given as an absolute path.
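For concreteness, here is the same command with the placeholders filled in.
The application name, directory layout, and paths below are purely
illustrative, not from the original setup:

```shell
# Hypothetical layout: running from /home/alice/my-project,
# which contains conf/myfile.conf.
spark-submit \
  --class pkg.to.MyApp \
  --master "local[*]" \
  --conf "spark.executor.extraClassPath=-Dconfig.file=myfile.conf" \
  --files conf/myfile.conf \
  --driver-class-path /home/alice/my-project/conf
```

As a side note, -Dconfig.file is the system property read by the Typesafe
Config library, and JVM -D options are more commonly passed through
spark.executor.extraJavaOptions; if the class-path variant above does not work
in your environment, that is the first thing I would try instead.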


‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
On August 17, 2018 3:00 AM, yujhe.li <liyu...@gmail.com> wrote:

> So can you read the file on executor side?
> I think the file passed by --files my.app.conf would be added under
> classpath, and you can use it directly.
>
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org


