[ 
https://issues.apache.org/jira/browse/SPARK-8622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Baswaraj updated SPARK-8622:
----------------------------
    Description: 
I ran into an issue where the executor is not able to pick up my configs/functions 
from my custom jar in standalone (client/cluster) deploy mode. I used the 
spark-submit --jars option to specify all the jars and configs to be used by 
executors.

All these files are placed in the executor's working directory, but not on the 
executor classpath. Also, the executor working directory itself is not on the 
executor classpath.

I expect the executor to find all files passed via the spark-submit --jars 
option.

In Spark 1.3.0, the executor working directory is on the executor classpath.

To run my application successfully with Spark 1.3.1+, I have to use the following 
option (conf/spark-defaults.conf):

spark.executor.extraClassPath   .

Please advise.
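For reference, the workaround can be applied either as the spark-defaults.conf entry above on each worker, or equivalently as a --conf flag at submit time. A minimal sketch; the jar names and master URL are hypothetical placeholders:

```shell
# Workaround sketch: put the executor's working directory (".") on its classpath.
#
# Option 1: add this line to conf/spark-defaults.conf on each worker node:
#   spark.executor.extraClassPath   .
#
# Option 2 (equivalent): pass the same property at submit time.
# "deps.jar", "my-app.jar", and the master URL are hypothetical examples.
spark-submit \
  --master spark://master:7077 \
  --jars deps.jar \
  --conf spark.executor.extraClassPath=. \
  my-app.jar
```

Both forms set the same spark.executor.extraClassPath property; the --conf form avoids touching the configuration on every worker.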

  was:
I ran into an issue where the executor is not able to pick up my configs/functions 
from my custom jar in standalone (client/cluster) deploy mode. I used the 
spark-submit --jars option to specify all the jars and configs to be used by 
executors.

All these files are placed in the executor's working directory, but not on the 
executor classpath. Also, the executor working directory itself is not on the 
executor classpath.

I expect the executor to find all files passed via the spark-submit --jars 
option.

In Spark 1.3.0, the executor working directory is on the executor classpath.

To run my application successfully with Spark 1.3.1+, I have to add the following 
entry on the slaves in conf/spark-defaults.conf:

spark.executor.extraClassPath   .

Please advise.


> Spark 1.3.1 and 1.4.0 don't put the executor working directory on the executor 
> classpath
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-8622
>                 URL: https://issues.apache.org/jira/browse/SPARK-8622
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Baswaraj
>
> I ran into an issue where the executor is not able to pick up my configs/
> functions from my custom jar in standalone (client/cluster) deploy mode. I used 
> the spark-submit --jars option to specify all the jars and configs to be used 
> by executors.
> All these files are placed in the executor's working directory, but not on the 
> executor classpath. Also, the executor working directory itself is not on the 
> executor classpath.
> I expect the executor to find all files passed via the spark-submit --jars 
> option.
> In Spark 1.3.0, the executor working directory is on the executor classpath.
> To run my application successfully with Spark 1.3.1+, I have to use the 
> following option (conf/spark-defaults.conf):
> spark.executor.extraClassPath   .
> Please advise.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
