[ https://issues.apache.org/jira/browse/SPARK-8622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14600942#comment-14600942 ]

Sean Owen commented on SPARK-8622:
----------------------------------

I don't think that is intended or even reasonable behavior. This mechanism is 
for transferring JARs to put on the classpath, not for putting arbitrary files 
on the executor.
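
As a minimal sketch of that distinction (the application, library, and config
file names below are hypothetical):

    # JARs passed with --jars are added to the executor classpath
    spark-submit --class com.example.Main --jars mylib.jar app.jar

    # Arbitrary files (e.g. configs) belong with --files; they are placed in
    # the executor working directory and can typically be read there or via
    # SparkFiles.get("app.conf")
    spark-submit --class com.example.Main --files app.conf app.jar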

> Spark 1.3.1 and 1.4.0 don't put the executor working directory on the 
> executor classpath
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-8622
>                 URL: https://issues.apache.org/jira/browse/SPARK-8622
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Baswaraj
>
> I ran into an issue where the executor is not able to pick up my configs and 
> functions from my custom jar in standalone (client/cluster) deploy mode. I 
> used the spark-submit --jars option to specify all the jars and configs to be 
> used by the executors.
> All these files are placed in the working directory of the executor, but not 
> on the executor classpath. The executor working directory itself is also not 
> on the executor classpath.
> I expect the executor to find all files specified via the spark-submit --jars 
> option.
> In Spark 1.3.0 the executor working directory is on the executor classpath, 
> so the app runs successfully.
> To run my application successfully with Spark 1.3.1+, I have to set the 
> following option in conf/spark-defaults.conf:
> spark.executor.extraClassPath   .
> Please advise.
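
For reference, the workaround described above amounts to adding the executor's
current working directory (".") to the executor classpath; a minimal sketch,
assuming the default conf location:

    # conf/spark-defaults.conf
    spark.executor.extraClassPath   .

    # or, equivalently, per submission:
    spark-submit --conf spark.executor.extraClassPath=. ...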



